Federal Regulators Investigate 2.9 Million Tesla Vehicles for FSD Red-Light Failures

Federal safety regulators are now investigating nearly 2.9 million Tesla vehicles after receiving multiple reports that the cars failed to stop at red lights while using the company’s “Full Self-Driving” (FSD) software. The National Highway Traffic Safety Administration (NHTSA) opened this extensive preliminary evaluation to determine whether Tesla’s driver-assistance system is exhibiting dangerous and unpredictable behavior that violates basic traffic laws.

The probe raises significant concerns about Tesla’s approach to deploying advanced software systems before they are fully proven safe in real-world conditions. Tesla recently rebranded the feature as “FSD (Supervised),” but this adjustment appears to be more of a marketing shift than a technical correction. Despite the name change, the system continues to operate as a Level 2 driver-assistance feature, meaning the driver must maintain full attention even while the car handles steering, braking, and acceleration.

The models under investigation include the 2016–2025 Model S, 2016–2025 Model X, 2017–2026 Model 3, 2020–2026 Model Y, and the 2023–2026 Cybertruck. These models represent nearly Tesla’s entire modern lineup, highlighting the scale of potential risk if FSD is malfunctioning on a system-wide level.

Sudden and Unpredictable Behavior Creates Immediate Safety Concerns

NHTSA launched the review after receiving at least 18 direct complaints and one media report describing Teslas rolling through red lights, failing to stop completely, or misinterpreting traffic-light colors while FSD was active. These incidents reveal failures in one of the most basic and essential driving functions: stopping at controlled intersections.

Six crash reports have also been submitted under NHTSA’s Standing General Order system, which requires automakers to disclose serious incidents involving automated systems. Four of those crashes resulted in confirmed injuries. These are not minor lapses—they involve circumstances where the car’s decisions directly exposed drivers, passengers, pedestrians, and other motorists to harm.

Drivers reported that the vehicles gave little or no warning before making incorrect or unsafe maneuvers. In several cases, owners said the Tesla accelerated or continued through an intersection despite a fully red traffic signal, leaving the driver almost no time to override the system. Such behavior is particularly alarming given Tesla’s marketing claims that FSD improves safety and reduces driver workload.

Repeated Failures at the Same Intersection Raise Red Flags

Investigators are focusing heavily on a cluster of incidents occurring at the same intersection in Joppa, Maryland. The fact that multiple events took place in the same location suggests that the system is not merely making isolated errors—it may be misinterpreting specific traffic-control environments entirely.

According to regulators, Tesla has already pushed a software modification to address behavior at this location, an indication that the company acknowledges the issue at some level. Still, a single patch does not explain why the system behaved incorrectly in the first place or whether similar problems may exist at countless other intersections nationwide.

Beyond traffic-light recognition, regulators are assessing a second pattern of concerning behavior. Drivers report that their Teslas drifted across double-yellow lines, entered oncoming lanes, or made improper turning maneuvers despite clear signage and lane markings. These events suggest deeper shortcomings in the vehicle’s lane interpretation, environment detection, or decision-making algorithms.

What concerns regulators most is the consistency of the failures. When an automated system repeatedly makes the same mistake in the same environment, it points to a fundamental flaw in how the vehicle interprets visual inputs such as signals, lights, and road geometry. Engineers warn that misreading a single intersection often indicates larger mapping or software logic problems that could appear anywhere similar conditions exist. If Tesla’s system struggles in one location, there is a real risk that thousands of intersections across the country could trigger the same dangerous response patterns. This raises the stakes for both regulators and owners, as the potential scope of error extends far beyond one isolated community.

Additional Complaints Show a Broader Pattern of Failure

In total, NHTSA’s Office of Defects Investigation has documented 24 complaints, six additional crash reports, and three media accounts describing hazardous behavior while FSD was active. These narratives often depict vehicles veering into the wrong lanes, entering intersections unlawfully, or failing to adhere to standard traffic rules that even inexperienced human drivers understand intuitively.

Some drivers said that their Tesla provided no audible alerts or steering resistance before executing these maneuvers, meaning the software’s mistakes happened too quickly for the human operator to correct. Given Tesla’s insistence that FSD requires full driver supervision, these sudden movements undermine the premise that the driver can safely intervene. If the system gives little to no warning of its intentions, the responsibility unfairly shifts to consumers, who must react within split seconds to prevent collisions.

This investigation will evaluate how the system detects traffic lights, interprets lane boundaries, and decides when to execute turns. Regulators will also determine whether Tesla’s software updates have improved performance or merely obscured underlying issues.

Why Tesla’s FSD Approach Raises Long-Term Safety Concerns

While advanced driver-assistance technologies are becoming increasingly common, Tesla’s approach to public beta-testing its “Full Self-Driving” software has been widely criticized by safety experts. Unlike traditional automakers that conduct extensive controlled testing before releasing driver-assistance features, Tesla often pushes updates directly to consumer vehicles and relies on real-world drivers—many with families in the car—to uncover errors.

This model places significant risk on consumers, who may not fully understand the system’s limitations or the dangers of overreliance on automation. By naming the feature “Full Self-Driving,” Tesla created expectations that the vehicle could handle complex scenarios despite the system remaining fundamentally incomplete and requiring constant supervision.

The current probe highlights how software misinterpretation, faulty environmental detection, or insufficiently tested algorithms can lead to dangerous outcomes. As vehicles become more software-dependent, regulators and consumers are increasingly questioning whether Tesla prioritizes rapid innovation over rigorous safety protocols. These concerns grow even stronger as more data surfaces showing recurring issues, injuries, and avoidable accidents tied to incomplete or inconsistent automated driving features.

The Investigation Could Expand and Lead to Further Regulatory Action

This probe joins two additional ongoing NHTSA investigations involving Tesla’s driver-assistance technology, including one tied to a fatal crash in 2024. Together, these inquiries paint a troubling picture of a system that may not reliably adhere to safety expectations despite its widespread deployment across millions of vehicles.

If investigators determine that FSD routinely fails to comply with basic traffic laws, Tesla could face substantial regulatory action, recalls, or requirements to modify or disable certain features. Regulators may also assess whether Tesla misled consumers by overstating the system’s capabilities or implying that the software was closer to full autonomy than it truly is.

As Tesla continues to roll out updates, the question becomes whether these changes genuinely resolve underlying issues or simply attempt to patch visible symptoms. Consumers deserve transparency and clarity, especially when software errors can result in catastrophic outcomes on public roads.

How The Barry Law Firm Can Help

At The Barry Law Firm, we help California consumers take legal action against manufacturers when their vehicles fail to meet quality and safety standards. If your Tesla has been in the shop repeatedly for mechanical or safety issues — or if a recall repair hasn’t fixed the problem — you may be entitled to a refund, replacement, or cash compensation under California’s Lemon Law.

Lemon Law Expertise – We specialize in California Lemon Law cases and know how to hold manufacturers accountable.
No Upfront Costs – The California Lemon Law requires the manufacturer to pay our fees. That means, at The Barry Law Firm, we will never charge you, no matter the outcome of your case.
Proven Success – We have helped thousands of consumers obtain favorable settlements for their defective vehicles.
Personalized Attention – We handle all legal paperwork and negotiations so you don’t have to deal with the stress.

If Tesla’s repairs have failed to solve your vehicle’s problems, you may have a Lemon Law case related to a Tesla FSD defect. Contact us today to explore your legal options.

Closing

Tesla’s ongoing safety issues surrounding FSD technology illustrate the risks of releasing partially developed software into the hands of everyday drivers. When automated systems misread traffic lights, drift into opposing lanes, or execute dangerous maneuvers without warning, they do more than undermine trust—they put lives at stake. California drivers deserve technology that enhances their safety, not systems that introduce new hazards.

As federal investigations continue, more information will surface about whether Tesla’s software can be trusted to follow basic traffic laws and protect the people inside and outside the vehicle. Consumers who experience recurring FSD issues should document every incident and stay informed about the findings of this probe. Tesla owners must not assume that updates alone will resolve all defects, especially when so many reports highlight unpredictable or unsafe behavior.

Your safety, the safety of your family, and the safety of other California motorists should always come first. If your Tesla shows signs of malfunctioning while using FSD, you should not wait for further investigations, expanded recalls, or additional regulatory action to protect your rights. The Barry Law Firm is committed to helping California drivers stand up to automakers when software failures compromise their well-being and peace of mind.

CALL FOR A FREE CONSULTATION

877-536-6603




The Barry Law Firm

11845 W Olympic Blvd Suite 1270

Los Angeles, California 90064

Phone: 310-684-5859

Free Consultation: 877-536-6603