Leaked data reveals: Tesla concealed Autopilot accidents that resulted in fatalities

In the US, Tesla is alleged to have failed to fully disclose serious accidents involving its Autopilot system for years, even though these very incidents were crucial for assessing the system’s risks in real-world use. This renewed criticism stems from a massive data leak that exposed thousands of internal processes and customer complaints. The documents suggest that Tesla used its vehicles on public roads as a testing ground, even though the system sometimes accelerated abruptly, braked for no reason, or misinterpreted obstacles. The consequences were severe: more than 1,000 accidents, serious injuries, fatalities among occupants and bystanders, and mounting pressure from courts and US authorities. (srf: 20.04.26)


Concealed Accidents Provided the Most Important Warning Signs

The data leak originated with an insider who came forward at the end of 2022. The leaked files contain thousands of customer complaints. More than 2,400 reports concern sudden acceleration, and the documents also list more than 1,000 accidents. In many cases, however, the status of these complaints remained “unresolved.”

Leaked data reveals that Tesla concealed Autopilot accidents. Fatalities and serious crashes were also not fully reported to authorities.

These accidents were crucial for the safety assessment. They revealed not theoretical weaknesses, but concrete errors in real-world traffic. If a manufacturer fails to disclose or downplays such incidents, authorities, courts, and buyers lack the very data that can be a matter of life and death. The accusation against Tesla therefore concerns not only technical deficiencies, but also systematic opacity regarding particularly critical incidents.

Tesla Used Public Roads as a Testing Ground

According to SRF and RTS, Tesla used normal road traffic as a development environment for its autonomous driving system. This quickly provided the company with vast amounts of data. At the same time, Tesla shifted the risks onto customers and other road users. Anyone driving alongside such a vehicle became part of a test without their consent.

Particularly problematic were malfunctions of the kind known in artificial intelligence as “hallucinations.” Vehicles would accelerate or brake without any apparent reason, and the system could misinterpret its surroundings. At high speeds, such errors immediately became life-threatening, because the car might detect an obstacle yet fail to react in time.


Verdict, Investigations, and Allegations of a Deliberate Cover-Up

Last summer, Tesla was ordered to pay more than $240 million in damages in the US. The lawsuit was brought by the family of 22-year-old Naibel Benavides, who died when a Tesla in Autopilot mode crashed into her partner’s car. A federal judge upheld the verdict at the end of February 2026. Tesla can still appeal.

One point in particular weighed heavily in the proceedings. When investigators requested the accident data from the black box, Tesla reportedly claimed it was damaged. However, experts hired by the victims’ lawyers recovered the deleted information. This data allegedly showed that Tesla was aware of the malfunction on the evening of the accident: the car had detected the obstacles but did nothing to prevent the collision. One survivor said, “When I found out, I felt like a guinea pig.”

As a result, several investigations are underway against Tesla in the US. The Department of Justice is investigating whether the company misled consumers, and the national road safety authority is also conducting inquiries. According to SRF, whistleblowers described a company that prioritized rapid development over safety. If this allegation is further substantiated, the issue will no longer be solely about faulty software. The question then arises whether Tesla concealed dangerous accidents in order to continue testing its Autopilot on the road despite known risks.
