Ouch!

Mobileye spills the beans: Tesla was dropped because of safety concerns

Now we know who ended the relationship.

Jonathan M. Gitlin
A hirsute Sebastian using Autopilot in a Tesla Model S. Credit: Sebastian Anthony

On Wednesday, Mobileye revealed that it ended its relationship with Tesla because "it was pushing the envelope in terms of safety." Mobileye's CTO and co-founder Amnon Shashua told Reuters that the electric vehicle maker was using his company's machine vision sensor system in applications for which it had not been designed.

"No matter how you spin it, (Autopilot) is not designed for that. It is a driver assistance system and not a driverless system," Shashua said.

In a statement to Reuters, Tesla said that it has "continuously educated customers on the use of the features, reminding them that they’re responsible to keep their hands on the wheel and remain alert and present when using Autopilot" and that the system has never been described as autonomous or self-driving. (This statement appears to be at odds with statements made by Musk at shareholder meetings.)


It is also emerging that the crash that cost Joshua Brown his life in May of this year was probably not the first fatal crash involving Tesla's Autopilot. In January of this year in China, a Tesla ploughed into the back of a stationary truck at speed, killing the driver.

Should that incident prove to be related to Autopilot, it would significantly undermine Elon Musk's claims on Sunday regarding the safety of the system. During a press conference announcing new features in Tesla's upcoming firmware version 8, Musk stated that Tesla's Autopilot "compares favorably" with human drivers, who on average suffer a fatality every 70 million to 80 million miles driven globally. (The number of Autopilot-driven miles covered up to Brown's crash has been disputed.)

But even if Tesla EVs had covered 94 million Autopilot-driven miles by the time of Brown's crash, two fatalities over that distance would mean the system had actually resulted in a fatality every 47 million miles (which perhaps underlines the problematic nature of inferring statistical certainty from events with a very low n). In April of this year, the RAND Corporation published a study that looked at how many miles an autonomous vehicle fleet would have to cover before it could be said to be safe with sufficient statistical confidence.
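For the back-of-the-envelope arithmetic, here is a minimal sketch in Python. It treats the disputed 94 million-mile figure and a count of two fatalities (Brown's crash plus the January crash in China) as given, purely for illustration:

```python
# Implied fatality rates, taking the disputed figures at face value.
AUTOPILOT_MILES = 94_000_000   # disputed estimate of Autopilot miles driven
AUTOPILOT_DEATHS = 2           # Brown's crash plus the January crash in China

# Global human average cited by Musk: one fatality per 70-80 million miles.
HUMAN_MILES_PER_DEATH = (70_000_000, 80_000_000)

miles_per_death = AUTOPILOT_MILES / AUTOPILOT_DEATHS
print(f"Autopilot: one fatality every {miles_per_death / 1e6:.0f} million miles")
print(f"Humans:    one fatality every {HUMAN_MILES_PER_DEATH[0] / 1e6:.0f}"
      f"-{HUMAN_MILES_PER_DEATH[1] / 1e6:.0f} million miles")
```

On those assumptions, Autopilot's implied record (one fatality per 47 million miles) would be worse than the human baseline, not better.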

"To demonstrate that fully autonomous vehicles have a fatality rate of 1.09 fatalities per 100 million miles (R=99.9999989%) with a C=95% confidence level, the vehicles would have to be driven 275 million failure-free miles," RAND said.

Changes in Tesla's firmware 8 include modifications to driver alerts when using Autopilot: the system will now disable itself if the driver ignores three audio warnings within an hour.
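As a rough illustration of that behavior, here is a minimal sketch (not Tesla's implementation; the class and method names are invented for this example) of a three-strikes counter over a rolling one-hour window:

```python
import time
from collections import deque

IGNORED_LIMIT = 3       # ignored warnings before Autopilot disables itself
WINDOW_SECONDS = 3600   # the one-hour window described for firmware 8

class AutopilotAlertTracker:
    """Tracks ignored audio warnings over a rolling one-hour window."""

    def __init__(self):
        self._ignored = deque()  # timestamps of ignored warnings

    def record_ignored_warning(self, now=None):
        """Record an ignored warning; return True if Autopilot should disable."""
        now = time.time() if now is None else now
        self._ignored.append(now)
        # Discard warnings older than the rolling window.
        while self._ignored and now - self._ignored[0] > WINDOW_SECONDS:
            self._ignored.popleft()
        return len(self._ignored) >= IGNORED_LIMIT
```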


Jonathan M. Gitlin, Automotive Editor
Jonathan is the Automotive Editor at Ars Technica. He has a BSc and PhD in Pharmacology. In 2014 he decided to indulge his lifelong passion for the car by leaving the National Human Genome Research Institute and launching Ars Technica's automotive coverage. He lives in Washington, DC.