Missy Cummings: Self-Driving Cars Need to Happen

The head of Duke University's robotics program weighs in on whether self-driving cars will overcome a fatal accident, the irresponsibility of Tesla's Autopilot system, and when she'll be ready to adopt an autonomous vehicle.


Editor’s Note: This episode of The Robotics Trends Show originally aired March 18, 2016, just days after Missy Cummings testified before Congress that self-driving cars aren’t ready for widespread deployment.

We are re-airing this episode in light of the first fatal self-driving car accident, which involved a Tesla Model S driving on Autopilot. One of the discussions on this podcast was the irresponsibility of Tesla releasing Autopilot to the public. Fast-forward to 7:50 of the podcast to hear the Tesla discussion and let us know your thoughts.

It might not seem like it judging by her testimony to Congress this week, but Missy Cummings is a fan of self-driving cars. Honestly. She just wants everyone to pump the brakes on all the hype.

The head of Duke University’s robotics program told Congress that self-driving cars are “absolutely not” ready for widespread deployment despite the rush to put them on the road. She said their inability to handle bad weather, including rain and snow, the ease with which hackers can take control of their GPS navigation systems, and the fact that they can’t follow the directions of a police officer all indicate the automotive industry isn’t “ready for humans to be completely taken out of the driver’s seat.”

Cummings joined the Robotics Trends Show to follow up on the heated Congressional hearing featuring auto executives and robotics experts.

She also weighed in on whether self-driving cars will overcome a fatal accident, the irresponsibility of Tesla’s Autopilot system, and when she’ll be ready to adopt an autonomous vehicle.

Listen to the podcast using the embedded player below and share your thoughts on self-driving cars.


Comments

Totally_Lost · July 4, 2016 · 11:08 am

Cars kill as many people each year in the U.S. as guns do, but unlike guns, 20-50 times that number are seriously injured or disabled in automotive accidents at a staggering cost. Road crashes cost the U.S. $230.6 billion per year.

Pavement is even more expensive ... current study with decade-old numbers: http://www.vtpi.org/tca/tca0506.pdf

From ASIRT:

Annual Global Road Crash Statistics

  Nearly 1.3 million people die in road crashes each year, on average 3,287 deaths a day.
  An additional 20-50 million are injured or disabled.
  More than half of all road traffic deaths occur among young adults ages 15-44.
  Road traffic crashes rank as the 9th leading cause of death and account for 2.2% of all deaths globally.
  Road crashes are the leading cause of death among young people ages 15-29, and the second leading cause of death worldwide among young people ages 5-14.
  Each year nearly 400,000 people under 25 die on the world’s roads, on average over 1,000 a day.
  Over 90% of all road fatalities occur in low and middle-income countries, which have less than half of the world’s vehicles.
  Road crashes cost USD $518 billion globally, costing individual countries from 1-2% of their annual GDP.
  Road crashes cost low and middle-income countries USD $65 billion annually, exceeding the total amount received in developmental assistance.
  Unless action is taken, road traffic injuries are predicted to become the fifth leading cause of death by 2030.

Annual United States Road Crash Statistics

  Over 37,000 people die in road crashes each year
  An additional 2.35 million are injured or disabled
  Over 1,600 children under 15 years of age die each year
  Nearly 8,000 people are killed in crashes involving drivers ages 16-20
  Road crashes cost the U.S. $230.6 billion per year, or an average of $820 per person
  Road crashes are the single greatest annual cause of death of healthy U.S. citizens traveling abroad

Totally_Lost · July 3, 2016 · 5:41 pm

The bottom line of Missy’s, my, and many other experts’ objections to operational use of this technology is the irresponsible deployment where the driver can falsely believe the technology has assumed defensive driving responsibilities. There is a huge difference between semi-autonomous deployment, where the technology assumes primary control, and completely passive emergency assists like ABS, traction control, collision avoidance braking, and roll prevention systems, which step in to protect less skilled drivers in difficult situations. Passive emergency systems REQUIRE the driver to take full responsibility for driving the car, including defensive driving strategies. Active semi-autonomous systems like Tesla’s Autopilot are structured to take primary control, and with that, the driver will falsely believe it’s not necessary to drive, as the system has assumed primary responsibility for driving, including all defensive driving requirements. This is WRONG. As I said March 19th ... “there are a long list of things that can kill, .... I believe the AI will have to get better than the human sensor system and brain before this is safe.”

I’m certainly not a leading expert in this field today, but at the same time I’m one of a few hundred engineers worldwide who have actually done active development trying to solve these problems, starting with a registered team for the original 2004 DARPA Grand Challenge. Twelve years later we have come a LOT farther, but have still only solved the easy 75% of the problem. If deployed too early, as Tesla and a number of others have done, the remaining 25% WILL CONTINUE TO KILL drivers, passengers, and innocent bystanders along our roadways.

There will be more accidents that kill drivers and their passengers ... but it will be the first accident that kills a pedestrian or child, rightfully resulting in the driver being charged with either murder or manslaughter, that will bring this technology to a halt. A case where the driver willfully delegated defensive driving to the semi-autonomous system, and allowed the car to kill by refusing to retain defensive driving responsibility through direct and purposeful abandonment of active control of the car.

Steve Crowe · March 20, 2016 · 4:47 pm

Did you guys read about the bicyclist in Texas who was basically toying with a Google car?

“The car arrived at the stop line a fraction of a second before I did and had the legal right-of-way. Not wanting to unclip my shoes from the pedals, I performed a maneuver called a track-stand in which a rider comes to a stop and then simply balances the bike without putting a foot down.  As I waited for the car to proceed, I noticed that there were two occupants inside that appeared to be observers/testers.

“The car remained motionless for several seconds and I continued to maintain my balance without moving. As the car finally inched forward, I was forced to rock the handlebars to hold my position. Apparently, this motion was detected by one of the sensors and the car stopped abruptly. I held the bike in balance and waited for another several seconds and the cycle repeated itself … the car inched forward, I shifted my weight and re-positioned the bars and the car stopped. We did this little dance three separate times and the car was not even halfway through the intersection.  I noticed at one point that the testers were laughing and apparently typing code into a terminal to ‘teach’ the vehicle how to deal with the situation.

“The car finally rolled on and I proceeded on my ride ...”

Full story here: http://www.roboticstrends.com/article/a_cyclists_encounter_with_an_indecisive_google_self_driving_car

Bob · March 19, 2016 · 2:41 pm

I agree. Following the directions of a policeman or other traffic director is a necessary feature to have. A couple of years ago, I developed an application for a camera sensor that can recognize gestures of the type needed for traffic control. The application was initially developed to work with the gestures used for directing aircraft, but could easily be extended to the gestures used for road traffic, which are similar in many ways.
It seems that another useful traffic direction capability would be a speech recognizer that could respond to oral directions in cases where gestures are not adequate to deal with the situation.
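A minimal rule-based sketch of how such a gesture classifier might work: given arm keypoints from a camera-based pose estimator, classify a few standard traffic-control gestures by arm angle. Every function name, coordinate, threshold, and gesture label here is an illustrative assumption, not a detail of the application described above.

```python
import math

def arm_angle(shoulder, wrist):
    """Arm elevation in degrees: 0 = extended horizontally, 90 = straight up.
    Points are (x, y) in image coordinates, where y grows downward."""
    dx = wrist[0] - shoulder[0]
    dy = shoulder[1] - wrist[1]  # flip so "up" in the image is positive
    return math.degrees(math.atan2(dy, abs(dx)))

def classify_gesture(l_shoulder, l_wrist, r_shoulder, r_wrist):
    """Map left/right arm poses to a coarse traffic-control gesture.
    Thresholds (60 and 20 degrees) are arbitrary illustrative values."""
    left = arm_angle(l_shoulder, l_wrist)
    right = arm_angle(r_shoulder, r_wrist)
    if left > 60 and right > 60:
        return "stop"       # both arms raised overhead
    if abs(left) < 20 and abs(right) < 20:
        return "proceed"    # both arms extended horizontally
    if left > 60 or right > 60:
        return "slow"       # one arm raised
    return "unknown"
```

A real system would smooth keypoints over time and reject ambiguous frames; this sketch only shows the per-frame classification idea.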

Totally_Lost · March 19, 2016 · 9:29 am

I think there are a long list of things that can kill, that the AI for cars will have a very difficult time at. The kid chasing a ball from a yard into the street behind parked cars. A flock of birds rising from the road side, and blinding the AI to the dangers on the other side. Deer, Elk, Moose, cats, dogs, cattle, and other large/small animals that dart across the road. Snow, rain, sleet, hail, ice, blowing leaves and sand, which leaves the roadway markings completely obscured for the AI. Roadway signs obscured by snow/mud blown onto the face of the sign. Trees blown down, or knocked down by ice storms. Accident/construction site pedestrians, crew, officers, tow truck operators in the marked roadway. Road construction areas with little to no markings. Temporary detours that are poorly marked. Cone marked areas for construction and events, with flaggers directing traffic. And there is a lot more. Enough more that I believe the AI will have to get better than the human sensor system and brain before this is safe.


