Friday October 20, 2017
When cars can drive themselves, who will program their morality?
Imagine driving quickly down a major city street.
A couple of kids come running out of a building on your right, and they end up on the road in front of your car. If you step on the brakes, you likely won't stop in time.
What do you do? Do you swerve left and into oncoming traffic? Do you swerve right and crash into the building? Or do you put your foot on the brake and hope for the best?
It's a terrible set of choices, and you probably won't have time to consider them in that moment. But what if you could decide on a response ahead of time? Or, rather, what if your car was programmed to make a decision for you? Does it protect the kids? Or does it protect you?
"We've been asked this question quite a bit," says University of Waterloo graduate student Nav Ganti. "Any demo we've been to, basically, this question has come up."
Nav is part of a team of students working on the "Autonomoose," the university's own self-driving car. It's a Lincoln MKZ Hybrid that they've outfitted to drive itself around using mapping software, radar, sonar, laser sensors, cameras and other technologies. It looks and drives like a normal car, but the person in the driver's seat is merely a passenger.
"The speed at which the car can make decisions is far faster than a human can … ten decisions before you've even made one," Nav says. So the car is much less likely to get itself into trouble in the first place.
But despite their capabilities, people might not totally trust autonomous vehicles just yet.
"Have they taken a bus before?" asks Ian Colwell, another grad student on the Autonomoose team. "Sure, you're not in control, but … this is just something else to put your trust in. Yes, it's a robot, but this thing can make way more decisions and perceive a lot more about the environment than any human."
These guys aren't responsible for making those ethical decisions just yet. But their technology can tell apart different kinds of obstacles – cars, poles, cyclists, pedestrians.
How the car should handle those obstacles is another matter, and a question that will need to be answered sooner or later.
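To see why "who programs the morality" is a real engineering question and not just a thought experiment, consider what such a policy would literally look like in code. This is a minimal, entirely hypothetical sketch: the harm weights, maneuver names, and scenario are invented for illustration and have nothing to do with how the Autonomoose actually works.

```python
# Hypothetical sketch of an ethical decision policy for an autonomous car.
# Every number here is an invented assumption, not a real system's values.

# Someone has to assign these weights -- that assignment IS the moral choice.
HARM_WEIGHTS = {"pedestrian": 10, "cyclist": 8, "car": 4, "pole": 1}

def harm(maneuver):
    """Total harm score: obstacles struck plus risk to the car's occupant."""
    obstacle_harm = sum(HARM_WEIGHTS[o] for o in maneuver["strikes"])
    return obstacle_harm + maneuver["occupant_risk"]

def choose(maneuvers):
    """Pick the candidate maneuver with the lowest total harm score."""
    return min(maneuvers, key=harm)

# The scenario from the opening: brake, swerve left, or swerve right.
scenario = [
    {"name": "brake", "strikes": ["pedestrian", "pedestrian"], "occupant_risk": 0},
    {"name": "swerve_left", "strikes": ["car"], "occupant_risk": 6},
    {"name": "swerve_right", "strikes": ["pole"], "occupant_risk": 3},
]

print(choose(scenario)["name"])  # swerve_right, under these invented weights
```

The arithmetic is trivial; the controversy is entirely in who sets the weights, and whether a child, a cyclist, or the car's owner counts for more. That is exactly the question Nav says comes up at every demo.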