Are Self-Driving Cars More or Less Crash-Prone?
Self-driving cars have a crash rate nearly five times that of human-driven vehicles.
Self-driving cars have a crash rate nearly five times that of human-driven vehicles, according to a new study from the University of Michigan Transportation Research Institute (UMTRI) that examined autonomous vehicle crashes reported by Google, Delphi, and Audi, all of which have licenses to operate self-driving vehicles in a number of states.
“A Preliminary Analysis of Real-World Crashes Involving Self-Driving Vehicles,” compiled by UMTRI researchers Brandon Schoettle and Michael Sivak, also found that the injury rate in self-driving car accidents is about four times that of human-driven vehicles, although the injuries are less severe than those in standard vehicles.
However, the researchers cautioned that those numbers may not tell the whole story. First, the data was pulled from 11 crashes among three makers of self-driving cars whose fleets only cumulatively drove 1.2 million miles. Second, none of the accidents were caused by the self-driving car; every crash was caused by a human driver in another vehicle.
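As a quick sanity check, the self-driving crash rate can be worked out from the numbers quoted above (11 crashes over a cumulative 1.2 million miles). This is a back-of-the-envelope sketch: the implied human-driven baseline below is inferred from the article's "nearly five times" ratio, not quoted directly from the study.

```python
# Crash rate implied by the article's own figures:
# 11 crashes across 1.2 million autonomous miles.
av_crashes = 11
av_miles_millions = 1.2

av_rate = av_crashes / av_miles_millions  # crashes per million miles
print(f"Self-driving crash rate: {av_rate:.1f} per million miles")

# If that is "nearly five times" the human-driven rate, the implied
# baseline (an inference, not a figure from the study) is roughly:
implied_human_rate = av_rate / 5
print(f"Implied human-driven rate: ~{implied_human_rate:.1f} per million miles")
```

Note how small the denominator is: with only 1.2 million miles and 11 crashes, a single crash more or less shifts the rate by nearly a full point, which is part of why the researchers cautioned against reading too much into the headline comparison.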
Here are some other statistics from the study (you can read the abstract here):
- 73% of the crashes involving self-driving cars occurred when the car was going 5 mph or less, or when it was stopped
- 15.8% of human-driven car crashes involve a fixed object, while 14% of self-driving car crashes involve a non-fixed object, such as a pedestrian
- 3.6% of human-driven car accidents were head-on crashes, while self-driving cars have only been involved in rear-end, side-swipe, or angled collisions
- Less than 1% of human-driven car accidents involve a fatal injury, while no self-driving car accident has resulted in a fatality
- 28% of human-driven car crashes result in a non-fatal injury, compared with only 18.2% of self-driving car crashes
- 81.8% of self-driving car crashes result in property damage
The study compared the self-driving cars from Audi, Delphi, and Google with 269 million conventional vehicles that had traveled 3 trillion miles across all 50 states in a mix of geographic and weather conditions.
These statistics may help explain Google’s recent decision to make its self-driving cars drive more like humans. Chris Urmson, head of Google’s self-driving car project, recently said the company’s self-driving cars are “a little more cautious than they need to be.” And although Google says it has never been at fault in any of its 16 minor accidents since 2009, some are starting to believe this over-cautious nature is the reason its self-driving cars have been rear-ended 12 times.
Schoettle and Sivak also recently proposed that self-driving cars need to pass driving tests. Humans need to pass driving tests to get a license, so why not robots? Rain, snow, and darkness make it more difficult for self-driving cars to recognize potential hazards, the researchers write, so the cars should be tested under a variety of weather conditions.
The researchers propose a graduated license system to address these weather limitations. For example, one model could be licensed to drive in good weather but not in snow, while another model that struggles to see at night could be licensed to drive only during the day.
Schoettle and Sivak don’t think every single self-driving car needs to be tested, but they say the new models that come out should be put to the test.