Autopilot Was Never Designed to Avoid Scenario of Fatal Accident

Mobileye said its technology that powers Tesla Autopilot wasn't designed to handle the scenario that led to the first fatal accident involving a self-driving car.


If you need any further evidence that Tesla was irresponsible with the release of its Autopilot semi-autonomous driving system, here it is: the technology was never designed to avoid the type of collision that cost 40-year-old Joshua Brown his life on May 7, 2016 in Williston, Fla.

Mobileye is the Israeli company that builds the vision-based driver-assistance technology behind Autopilot. In a statement, Mobileye said its system is designed specifically to avoid rear-end collisions, not the crossing-traffic scenario that led to the fatal crash.


“We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB), is defined as rear-end collision avoidance, and is designed specifically for that. This [fatal] incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.”
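To make the distinction Mobileye is drawing concrete, here is a minimal, hypothetical sketch (not Mobileye's actual logic, and all names and thresholds are invented for illustration) of why a rear-end-only AEB decision rule can ignore a vehicle crossing laterally across the car's path:

```python
# Hypothetical sketch: a rear-end-only AEB rule. It assumes the obstacle
# travels in the ego vehicle's own lane, so a fast laterally moving target
# (like a truck turning across the road) is filtered out by design.
from dataclasses import dataclass

@dataclass
class Track:
    """A detected object, expressed in the ego vehicle's frame."""
    range_m: float        # distance ahead of the ego vehicle, meters
    closing_speed: float  # m/s; positive means the gap is shrinking
    lateral_speed: float  # m/s; sideways motion across the ego lane

def rear_end_aeb_should_brake(track: Track,
                              max_lateral_speed: float = 2.0,
                              ttc_threshold_s: float = 1.5) -> bool:
    """Brake only for in-path, longitudinally closing targets."""
    if abs(track.lateral_speed) > max_lateral_speed:
        # Crossing target: outside the design scope of rear-end AEB.
        return False
    if track.closing_speed <= 0:
        # Not closing on the target; no rear-end threat.
        return False
    time_to_collision = track.range_m / track.closing_speed
    return time_to_collision < ttc_threshold_s

# A slow lead car in the same lane triggers braking (TTC ~1.33 s)...
lead_car = Track(range_m=20.0, closing_speed=15.0, lateral_speed=0.0)
# ...but a tractor-trailer sweeping across the lane is filtered out.
crossing_truck = Track(range_m=20.0, closing_speed=15.0, lateral_speed=5.0)
```

The Lateral Turn Across Path (LTAP) capability Mobileye describes would, in effect, add a separate decision path for exactly those crossing targets that this kind of rule discards.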


The National Highway Traffic Safety Administration (NHTSA) said preliminary reports indicated that the crash occurred when a tractor-trailer made a left turn in front of the Tesla, and the car failed to apply the brakes. Tesla said “neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.”

This is the first known fatal accident involving a self-driving car, casting some doubt on whether autonomous vehicles can consistently make split-second, life-or-death driving decisions on the highway.

In May 2016, just days after testifying to Congress that self-driving cars aren’t ready for mass deployment, Missy Cummings joined The Robotics Trends Show to discuss how irresponsible Tesla had been by releasing Autopilot to the public.

The Associated Press reports that a DVD player was found in the wreckage of the fatal crash. The driver of the tractor-trailer said Brown was watching a Harry Potter movie at the time of the crash, though the AP could not confirm that claim.

But if Autopilot can’t see and avoid vehicles crossing laterally in front of it, there’s no way it’s ready for the roads.

Diagram of Fatal Tesla Crash: A diagram from the police report shows how the vehicle in self-driving mode (V02) struck a tractor-trailer (V01) as it was turning left. (Credit: Florida Highway Patrol via New York Times)

Source: TechCrunch




About the Author

Steve Crowe · Steve Crowe is managing editor of Robotics Trends. Steve has been writing about technology since 2008. He lives in Belchertown, MA with his wife and daughter.
Contact Steve Crowe: scrowe@ehpub.com




Comments

Totally_Lost · July 2, 2016 · 2:22 am

The mistake of both the driver and Autopilot was not defensively recognizing the semi setting up for the turn, and not setting up a defensive strategy in case the truck driver didn't see the oncoming car. The software did not do this ... its sensors probably were not usable at the 100-500 yard range where this would have been visible to an alert driver. The software was in control, the driver delegated defensive driving to the Autopilot and willfully became distracted. Tesla's choice to allow the driver to delegate defensive driving is the critical choice that killed this driver. Tesla did not provide a passive safety system that requires the driver to remain actively in control with defensive driving skills, but rather decided to sell a product which purposefully creates the illusion of a safe alternative.

Totally_Lost · July 2, 2016 · 1:56 am

The mistake is taking the driver out of an active defensive-driving mode/state. Safe is passively active, like ABS and traction control: something that helps the driver not make mistakes. Anything that promotes the driver NOT being actively in control WILL create these kinds of deaths, and MORE often than if the driver were forced to actually drive. There are lessons that are critical ... http://home.iwichita.com/rh1/eddy/Safe_Airplane_NOT.htm





