Study shows Tesla Autopilot has been involved in hundreds of crashes and dozens of investigations

Tesla’s driver-assistance technology has been sold as a glimpse of a safer, automated future, yet the crash record tied to Autopilot and related systems tells a more complicated story. You are now confronted with a technology that is marketed as making driving easier and safer, even as federal data and legal filings link it to hundreds of collisions and dozens of investigations into deaths and serious injuries.

As regulators widen their probes and Tesla revises both its software and its business model, you are left to navigate a landscape where the same system can be described as “nine times safer than humans” and, in the next breath, as a uniquely risky experiment on public roads. Understanding how those claims coexist is no longer optional if you share the road with these vehicles.

The scale of Autopilot crashes and deaths

You cannot assess Tesla’s promises without first grappling with the sheer number of serious incidents tied to its driver-assistance features. A detailed federal analysis of crash data found that vehicles guided by Autopilot had been involved in more than 700 crashes, a tally that cuts through marketing language and shows how often the system is implicated when things go wrong. Separate legal reporting out of Fremont, California, links Autopilot to 17 fatalities and 736 serious collisions, underscoring that the toll is not limited to minor fender benders. When “Autopilot involved in fatalities and serious collisions” has become a standing phrase in legal complaints, it is clear this is not a fringe concern.

Those headline figures are reinforced by independent legal and advocacy work that tracks deaths linked to Tesla systems. One plaintiffs’ firm argues that the statistics surrounding Autopilot raise serious questions about the technology’s reliability, and that the public record likely understates the true number of deaths. A separate data project compiles extensive analysis of Tesla crashes and fatalities, cataloging incidents in which Autopilot or related features were reportedly active. Taken together, these sources show that when you activate Autopilot, you are engaging with a system that has already been linked to hundreds of serious crashes and a growing list of fatalities.

Inside the federal investigations

Regulators have responded to this pattern with a series of escalating probes that directly affect how you should think about using these systems. A federal safety report on Tesla Autopilot concluded that the system lacked standard protections that other driver-assistance packages use to keep motorists engaged, and government engineers tied those design choices to crashes and deaths. U.S. regulators have also launched a sweeping inquiry into nearly 2.9 million Tesla cars equipped with Full Self-Driving, with officials noting that they have been investigating a series of crashes, including one in which a pedestrian was killed, as part of a broader federal probe into traffic violations and safety defects.

The regulatory pressure has only intensified. In one inquiry, NHTSA is targeting roughly 2.8 million vehicles, a sign that the agency no longer treats Autopilot and Full Self-Driving as niche products but as mass-market systems with systemic risks, according to regulatory filings. In parallel, Tesla has been granted more time in a U.S. investigation into its self-driving technology, with officials scrutinizing how the software handles crashes with other vehicles and injuries, as described in a federal notice. Another report notes that Austin-based Tesla has been given a five-week extension to answer detailed questions in a federal investigation into its Full Self-Driving (Supervised) features, highlighting how deeply regulators are now probing the company’s engineering and safety culture in public briefings.

What Tesla’s own numbers say about safety

Against this backdrop, Tesla has tried to reassure you with its own data, arguing that Autopilot dramatically reduces crash risk. In its Vehicle Safety Report for Q3 2025, Tesla presents figures comparing miles driven per crash with Autopilot engaged to miles per crash without it, and on that basis claims the system is nine times safer than human drivers. A separate breakdown of the same report emphasizes that drivers use Autopilot mostly on long highway stretches, where crash rates are typically lower, a caveat that sits at the heart of the nine-times-safer claim. Other coverage frames the release as stunning safety data, pointing to internal metrics that show Autopilot outperforming the average human driver by a wide margin in miles between crashes, according to company data.

Yet even Tesla’s own numbers are more nuanced than the marketing gloss suggests, and you should read them carefully. The core metric in the Q3 2025 Vehicle Safety Report is one crash every 6.36 million miles driven with Autopilot engaged, compared with roughly one crash every 702,000 miles when the system was not active, according to Tesla’s reports. A community summary notes that Tesla’s second-quarter 2025 vehicle safety report showed similar improvements linked to Autopilot and Full Self-Driving, with comparable miles-between-crashes figures, as described in a club briefing. At the same time, a critical review of Tesla’s own data finds that Autopilot safety regressed in 2025, with miles between crashes worsening year over year even as Tesla expanded the system to more drivers, according to an independent analysis. For you, the takeaway is that Tesla’s own figures can both support claims of relative safety and reveal troubling backsliding as the technology scales.
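The headline ratio follows directly from those two crash intervals. A minimal sketch of the arithmetic, using the figures Tesla reports for Q3 2025:

```python
# Crash intervals from Tesla's Q3 2025 Vehicle Safety Report
miles_per_crash_autopilot = 6_360_000  # one crash per 6.36M miles with Autopilot engaged
miles_per_crash_without = 702_000      # one crash per ~702K miles without it

# Ratio behind the "nine times safer" claim
ratio = miles_per_crash_autopilot / miles_per_crash_without
print(f"Autopilot miles-per-crash advantage: {ratio:.1f}x")  # roughly 9.1x
```

Note that this ratio compares mostly-highway Autopilot miles against all other driving, including city streets, so it is not an apples-to-apples measure of the system’s safety.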

Design flaws, human behavior, and real-world crashes

Numbers alone do not explain why Autopilot has been tied to so many severe crashes, and this is where design choices and human behavior collide. A federal investigative report into crashes and deaths associated with Tesla Autopilot found that the system lacked standard protections to ensure drivers stayed attentive, and government engineers concluded that this contributed to serious collisions, according to a detailed technical review. One widely discussed crash involved a driver named Banner, who engaged the system shortly before his car underrode a semi-trailer crossing the highway; ten seconds later, his car had passed under the trailer, and investigators noted that Banner had his hands off the wheel for the final eight seconds, raising questions about how the system monitored engagement and whether the recall was actually effective, according to a crash reconstruction. When you see how quickly a momentary lapse can turn fatal, the gap between Autopilot’s branding and its real-world demands becomes stark.

Regulators have also zeroed in on how Tesla markets and deploys its more advanced Full Self-Driving features. Late last year, California regulators found that Tesla had engaged in deceptive marketing and false advertising around its vehicles’ autonomous capabilities, even as the company pushed ahead with robotaxis in Austin, according to a regulatory summary. A more recent policy shift shows Tesla dropping free Autopilot and steering new owners toward FSD (Supervised), ending the option to buy the feature through a one-time $8,000 purchase after Feb 14, 2026, and drawing fresh criticism over the misleading “Autopilot” name, according to product disclosures. When you combine permissive branding, limited driver monitoring, and complex edge cases like crossing trucks or pedestrians, the result is a system that can lull you into overtrusting automation that still requires constant vigilance.

A turning point for Tesla and for you
