Amid all the buzz about vehicles that drive themselves, there are serious ethical questions facing regulators, manufacturers and the people who will ride in them. If faced with an unavoidable fatal crash, would the car be programmed to save its occupants at all costs or would it sacrifice its passengers for the greater good of saving a group of pedestrians?
"There's this trade-off between the interests of the driver, or rather the passenger who buys the car, and the level of public acceptance versus public outrage," says Azim Shariff of the Culture and Morality Lab at the University of Oregon. Along with researchers from France and the Massachusetts Institute of Technology, Shariff set out to test public attitudes on the cold, hard decisions computer programs will have to make when lives are on the line.
Shariff says it's important that the question is addressed before fully autonomous vehicles begin filling streets and highways in the coming years.
"Car companies are not going to be able to figure out what to do unless they know people's psychology on this."
Researchers created several scenarios in which a driverless car would have to choose between saving its passengers and sacrificing them to save a greater number of pedestrians. About 900 adults were surveyed on what they would want the car to do.
The situations involved a car hurtling toward a group of up to 10 pedestrians, with the only options being to hit them or to swerve into a barrier, killing the passenger inside. The number of pedestrians varied among scenarios, as did the perspective respondents were asked to take: inside the car, among the pedestrians or as a bystander.
On one level the result was not surprising. "People do generally — at least they say that they'd be willing to go with the more utilitarian option," Shariff says. "That's especially the case when they're going to be the pedestrians. So they don't want these cars to be willing to drive over 10 people."
Among those asked to see themselves as the passenger inside the vehicle, the result was somewhat different: about 25 per cent said the car should save them at all costs, even if that meant killing the pedestrians.
Driver vs. pedestrian dilemma
Shariff says there are other situations that still need to be tested. "Oftentimes, in fact most of the time, that these unavoidable accidents occur, it's not going to be a decision between two completely equivalent people, say a 40-year-old male driver versus a 40-year-old male pedestrian." He cites instances in which children are in the car or the pedestrians are elderly.
"How is it going to make those decisions about who gets to live and who doesn't?"
Among those on the front lines in the development of autonomous vehicles, the moral decisions programmed into a car are important, but not at the forefront, Paul Godsmark says. He is the chief technology officer at the Canadian Automated Vehicles Centre of Excellence. He says the study raises some valuable points that should be debated.
"The really good question that the paper highlighted was would you be prepared to be the owner and die in the vehicle rather than kill a number of people, and that's where most people had the biggest dilemma."
One option may be to allow the owner of the vehicle to set how it would respond in potentially fatal situations, choosing in advance whether it should save the occupants or the pedestrians. Given the choice, it's a tough decision, Godsmark says.
"Until I get in the car, I won't know," he says, adding, "personally I'd probably err on the side of I'd rather kill a few than many."
Either way, driverless cars are expected to be far better than humans at avoiding collisions. A recent report on autonomous vehicles prepared by the Conference Board of Canada predicts an 80 per cent reduction in traffic fatalities once driverless cars rule the road.
And that time is coming soon. Ontario will begin testing driverless vehicles on its highways beginning Jan. 1, 2016.