Will Self-Driving Cars Change the Rules of the Road?

Google claims computer-navigated cars are safer than human-driven ones, but they pose a flood of new legal questions

A Google self-driving car maneuvers through the streets of Washington, D.C., on May 14, 2012 (Photo: Karen Bleier / AFP / Getty Images)

Not long ago, self-driving cars seemed like science fiction. But Google is now operating so-called autonomous cars in California and Nevada, and last week at the annual Consumer Electronics Show in Las Vegas, Toyota and Audi unveiled prototypes for self-driving cars to sell to ordinary car buyers. (Google co-founder Sergey Brin said last year he expects his company to have them ready for the general public within five years.) In a report backing self-driving cars, the consulting firm KPMG and the Center for Automotive Research recently predicted that driving is “on the brink of a new technological revolution.”


But as the momentum for self-driving cars grows, one question is getting little attention: Should they even be legal? And if they are, how will the laws of driving have to adapt? All our rules about driving — from who pays for a speeding ticket to who is liable for a crash — are based on having a human behind the wheel. That is going to have to change.

There are some compelling reasons to support self-driving cars. Regular cars are inefficient: the average commuter spends 250 hours a year behind the wheel. They are also dangerous: car crashes are a leading cause of death for Americans ages 4 to 34 and cost some $300 billion a year. Google and other supporters believe that self-driving cars can make driving more efficient and safer by eliminating distracted driving and other human error. Google’s self-driving cars use roof-mounted cameras to survey their surroundings and onboard computers to do the driving. Their safety record so far is impressive: over their first 300,000 miles, Google reported that its cars had not had a single accident. Last August, one got into a minor fender bender, but Google said it occurred while a person was driving the car manually.

After heavy lobbying and campaign contributions, Google persuaded California and Nevada to enact laws legalizing self-driving cars. The California law breezed through the state legislature — it passed 37-0 in the senate and 74-2 in the assembly — and other states could soon follow. The Alliance of Automobile Manufacturers, which represents big carmakers like GM and Toyota, opposed the California law, fearing it would make it too easy for carmakers and individuals to modify cars to self-drive without the careful protections built in by Google.


That is a reasonable concern. If we are going to have self-driving cars, the technical specifications should be precise. Just because your neighbor Jeb can jury-rig his car to drive itself with an old PC and some fishing tackle does not mean he should be allowed to.

As self-driving cars become more common, there will be a flood of new legal questions. If a self-driving car gets into an accident, the human who is “co-piloting” may not be fully at fault — he may even be an injured party. Whom should someone hit by a self-driving car be able to sue? The human in the self-driving car or the car’s manufacturer? New laws will have to be written to sort all this out.


How involved — and how careful — are we going to expect the human co-pilot to be? As a Stanford Law School report asks, “Must the ‘drivers’ remain vigilant, their hands on the wheel and their eyes on the road? If not, what are they allowed to do inside, or outside, the vehicle?” Can the human in the car drink? Text-message? Read a book? Not surprisingly, the insurance industry is particularly concerned and would like things to move slowly. Insurance companies say all the rules of car insurance may need to be rewritten, with less of the liability placed on those who operate cars and more on those who manufacture them.

At the signing ceremony for California’s self-driving-car law, Governor Jerry Brown was asked who is responsible when a self-driving car runs a red light. He answered: “I don’t know — whoever owns the car, I would think. But we will work that out. That will be the easiest thing to work out.” Google’s Brin joked, “Self-driving cars don’t run red lights.”

Neither answer is sufficient. Self-driving cars should be legal — and they are likely to start showing up faster and in greater numbers than people expect. But if that is the case, we need to start thinking about the legal questions now. Given the high stakes involved in putting self-guided, self-propelled, high-speed vehicles on the road, “we will work that out” is not good enough.