Updated: March 16, 2013 at 10:45 p.m. EST
The robotics revolution is set to bring humans face to face with an old fear — man-made creations as smart and capable as we are but without a moral compass. As robots take on ever more complex roles, the question naturally arises: Who will be responsible when they do something wrong? Manufacturers? Users? Software writers? The answer depends on the robot.
Robots already save us time, money and energy. In the future, they will improve our health care, social welfare and standard of living. The combination of computational power and engineering advances will eventually enable lower-cost in-home care for the disabled, widespread use of driverless cars that may reduce drunk- and distracted-driving accidents and countless home and service-industry uses for robots, from gutter cleaning to food preparation.
But there are bound to be problems. Robot cars will crash. A drone operator will invade someone’s privacy. A robotic lawn mower will run over a neighbor’s cat. Juries sympathetic to the victims of machines will punish entrepreneurs with company-crushing penalties and damages. What should governments do to protect people while preserving space for innovation?
Big, complicated systems on which much public safety depends, like driverless cars, should be built, programmed and sold by manufacturers who take responsibility for ensuring safety and are liable for accidents. Governments should set safety requirements and then let insurers price the risk of the robots based on the manufacturer’s driving record, not the passenger’s.
But not every kind of robot maker should be responsible for its creations. Ryan Calo of the University of Washington School of Law argues that to foster start-up-style innovation in home and service robots, the platforms have to be open, meaning that any app developer can write a program that teaches your floor-mopping robot to clean windows too — much as smartphones have been taught to do more than make calls. The fault for any hiccups would lie with the app developer or the user.
None of that means there won’t be accidents as we look to robots to improve life. But at least we’ll know whom to blame.
The original version of this article has been updated to reflect Ryan Calo’s research.