Tech Experts State The Obvious: 'Someone Is Going To Die' In A Self-Driving Car

While Google and other companies work to improve autonomous vehicle technologies, legislatures and lawmakers are busy adopting rules and regulations to ensure that testing self-driving cars is a safe endeavor. Be that as it may, the one inescapable reality is that at some point, there's going to be a fatality involved.

"There is no question that someone is going to die in this technology," Missy Cummings, a roboticist and associate professor at Duke University, stated in testimony to the U.S. Senate Committee on Commerce, Science, and Transportation. "The question is when and what can we do to minimize that."

Google Self-Driving Car

Lawmakers are intent on laying out a set of universal standards for self-driving cars, though automakers and others involved with autonomous vehicles argue that layering on rules would only slow their efforts. We saw this play out late last year when California's Department of Motor Vehicles drafted a set of rules placing several restrictions on the use of self-driving cars, including the requirement that a licensed driver be behind the wheel at all times.

"In developing vehicles that can take anyone from A to B at the push of a button, we’re hoping to transform mobility for millions of people, whether by reducing the 94 percent of accidents caused by human error or bringing everyday destinations within reach of those who might otherwise be excluded by their inability to drive a car," Google spokesman Johnny Luu stated at the time.

Google ultimately won that argument when the National Highway Traffic Safety Administration (NHTSA) two months later determined that the artificial intelligence system controlling its fleet of autonomous vehicles qualifies as a "driver" under federal law.

Self-Driving Car in Public

As the technology for autonomous vehicles advances, the problems to be solved become more complex. Google's director of self-driving cars, Chris Urmson, noted that there are some morally difficult challenges to solve, such as deciding what a self-driving car should do when every option is grave: continue into the path of a child playing in the road, or swerve and fling itself (and its passenger) off an overpass.

There are other challenges that come with rushing to test self-driving cars on public roads.

"We know that people, including bicyclists, pedestrians and other drivers, could and will attempt to game self-driving cars, in effect trying to elicit or prevent various behaviors in attempts to get ahead of the cars or simply to have fun," Cummings said.

Self-driving cars have so far proven remarkably safe across millions of miles of testing. Until recently, autonomous vehicles were never deemed at fault in the limited number of accidents they were involved in, most of which entailed being rear-ended at low speeds. That spotless at-fault record ended, however, when one of Google's autonomous cars struck a bus while attempting a lane change.