Has Tesla Lapped Google in Self-Driving Car Race?

Tesla customers have driven more than 100 million miles on Autopilot. That's a lot of self-driving miles, but is it enough to give Tesla the edge over Google in self-driving car development?


Reports from Tesla suggest it is gathering huge amounts of driving data from logs in its cars - 780 million miles of driving, and as much as 100 million miles in Autopilot mode. This contrasts with the 1.6 million miles of test operations at Google.

Huge numbers, but what do they mean now, and in the future?

As I’ve written before, testing is one of the biggest remaining challenges in self-driving car development - how do you prove to yourself and to others that you’ve reached the desired safety goals? Tons of miles are a very important component of that. If car companies are able to get their customers to do the testing for them, that can be a big advantage. (As I wrote last week, another group that can get others to do testing is companies like Uber and even operators of large commercial and taxi fleets.) Lots of miles mean lots of testing, lots of learning, and lots of data.

Does Tesla’s quick acquisition of so many miles mean they have lapped Google? The short answer is no, but it suggests a significant threat since Google is, for now, limited to testing with its small fleet and team of professional testing drivers.

Tesla is collecting vastly less data per mile from its cars than Google does. First of all, Tesla's cars have far fewer sensors and no LIDAR, and according to various sources I have spoken to, Tesla is uploading only a fraction of what its sensors gather. Collecting everything the sensors produce would be a huge data volume - not something you would send over the cell network, and even over home Wi-Fi it would be very noticeable. Instead, reports suggest Tesla is gathering data only on incidents and road features the car did not expect or did not handle well. However, nothing stops Tesla from logging more in the future, though the company might want to get approval from owners to use all that bandwidth.
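To see why full sensor logging is impractical over consumer links, here is a rough back-of-envelope sketch. All of the numbers (camera resolution, frame rate, uplink speed) are illustrative assumptions for the estimate, not Tesla specifications:

```python
# Rough, illustrative estimate of raw sensor log volume for one car.
# Every figure here is an assumption for the sketch, not a Tesla spec.

camera_mb_per_s = 1280 * 960 * 3 * 30 / 1e6   # one uncompressed RGB camera at 30 fps: ~110 MB/s
radar_mb_per_s = 1.0                           # radar + ultrasonics, a generous round guess
total_mb_per_hour = (camera_mb_per_s + radar_mb_per_s) * 3600

lte_mbit_per_s = 10                            # typical LTE uplink speed
upload_hours = total_mb_per_hour * 8 / lte_mbit_per_s / 3600

print(f"raw log: ~{total_mb_per_hour / 1000:.0f} GB per driving hour")
print(f"upload time over LTE: ~{upload_hours:.0f} hours per driving hour")
```

Even with heavy compression knocking these figures down by an order of magnitude or two, uploading everything would swamp a cell plan and strain home Wi-Fi, which is consistent with the reports that Tesla uploads only selected incidents and anomalies.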

Dateline NBC reports on self-driving cars, including an interview with Brad Templeton.

Tesla wants to make a self-driving car people can buy today. As such, it has no LIDAR, because what a car can do today, even with Autopilot, can be done without LIDAR. Tomorrow’s LIDARs will be cheap, but today’s production automotive LIDARs are simple and/or expensive. So while the real production door-to-door self-driving car almost certainly uses LIDAR, Tesla is unwilling and unable to test and develop with it. (Of course, Tesla can also argue that in a few years, neural networks will be good enough to eliminate the need for LIDAR. That’s not impossible, but it’s a risky bet. The first cars must be built in a safety-obsessed way, and you’re not going to release a car less safe than you could have made it just to save what will by then be only a few hundred dollars of cost.)

As noted, Google has been doing its driving with professional safety drivers, who are also recording a lot of data from the human perspective that ordinary drivers never will. That doesn’t make Google’s miles 100 times better, but it’s pretty important.

Tesla is also taking a risk, and this has shown up in a few crashes. Their customers are beta testing a product that’s not yet fully safe. In fact, it was a pretty bold move, and it’s less likely that the big car companies would have turned their customers into beta testers - at least not until forced by Tesla. If they do, the big automakers have even more customers than Tesla, and they can rack up even more miles of testing and data gathering.

When it comes to training neural networks, ordinary drivers can provide a lot of useful data. That’s why Comma.ai, which I wrote about earlier, is even asking volunteers to put a smartphone on their dash facing out to get them more training data. At present, this app does not do much, but it will not be hard to make one that offers things like forward collision warning and lane departure warning for free, paid for by the data it gathers.

This article was republished with permission from Brad Templeton’s Robocars Blog.




About the Author

Brad Templeton is a developer of and commentator on self-driving cars. He writes and researches the future of automated transportation at Robocars.com.
Contact Brad Templeton: 4brad@templetons.com



Comments

Totally_Lost · July 1, 2016 · 2:30 pm

I had seen the assertion of sun behind the trailer ... that seems to have been just a bright sky behind a white trailer. Either way, neither the driver nor the Tesla reacted defensively, by watching the truck-trailer’s actions before the turn started, or by braking once the truck began its turn.

Totally_Lost · July 1, 2016 · 2:25 pm

I will also double down on my assertion that fully automated drone air transportation is SIGNIFICANTLY safer than drone ground transportation for these reasons. Congestion in the air is significantly less than on the ground, where ALL traffic is compressed into narrow roadways. Skip drone ground transportation, jump directly to drone air transportation, and save society the huge costs of ground transportation at all levels, both personal and governmental ... and save tens of thousands of lives each year.

Totally_Lost · July 1, 2016 · 2:20 pm

The issue here was a false sense of security in a blinding condition for the driver; it turned out the car’s sensors were also blind and unable to assist. A tractor-trailer (semi) turned in front of the Tesla, with the sun behind it ... the driver wasn’t fully alert, and the car’s sensors were also blinded. The Tesla went under the semi-trailer, which ripped the top off, killing the driver. The car ended up several hundred feet past the other side of the trailer. The driver was dead at the scene when authorities arrived.

So the false sell in this technology is that the automated system is “safer than a human driver” ... and that false sense of security killed the driver who didn’t remain alert with a defensive driving posture.

Steve Crowe · July 1, 2016 · 10:29 am

Also, check out this podcast with Missy Cummings from March 2016. We talked a lot about the irresponsibility of Tesla and what would happen if a fatal accident happened. I’ll be reposting this podcast today due to its relevance.

http://www.roboticstrends.com/article/missy_cummings_self_driving_cars_need_to_happen

Steve Crowe · July 1, 2016 · 10:18 am

Check out our coverage of this here:

http://www.roboticstrends.com/article/tesla_autopilot_involved_in_first_fatal_self_driving_car_accident

I agree Tesla Autopilot is a mistake. People are trusting the technology too much. In Tesla’s defense, they’ve expressed over and over and over again that humans need to keep their hands on the wheel while Autopilot’s engaged. But Tesla should also know that humans won’t adhere to these warnings.

Sad situation all around. Not sure anything could have helped Joshua Brown if the tractor-trailer was in the wrong.

Totally_Lost · June 30, 2016 · 7:28 pm

I’ve been pretty vocal that this is a mistake, and people really need to slow down and take these cars back to test tracks to debug corner conditions that can be collected from highly sensor/logged human driven cars. The first person actually died because they trusted this tech ... more to come if this foolishness isn’t brought back to reality.

https://finance.yahoo.com/news/u-opens-investigation-fatal-crash-tesla-203615260--finance.html





