The Current

Death of Tesla driver tests future of driverless car

Self-driving car technology has come so far, so fast, many don't realize these cars are already on the road. Now that a man has died after using Tesla's autopilot feature, some question if it all happened too fast without proper safeguards in place.
A Tesla Model S involved in the fatal crash on May 7, 2016. The top third of the car was sheared off in the collision with a tractor-trailer truck on a Florida highway. (Courtesy Robert VanKavelaar/Handout via Reuters)


The first-ever death of a driver in an accident involving a self-driving car in May is raising tough questions about technology, blame, and where the future of autonomous vehicles is headed.

Matt Bubbers, an automotive editor for Sharp Magazine, tells The Current's host Mike Finnerty that unfortunately "it was just a matter of time before this happened."

The so-called "autopilot system" in the Tesla Model S sedan was in use when Joshua D. Brown of Canton, Ohio, was killed May 7 in Williston, Fla. Just one month earlier, Brown had credited the autopilot system for preventing a collision on an interstate.

Bubbers considers the term "autopilot" a misnomer. He adds that in this tragic accident, "it is important to note that this car didn't drive by itself, it needs a human driver."

Bubbers has test driven the autopilot system in a Tesla and says "it was a surreal experience."

"It's a leap of faith to take your hands off the wheel and feet off the pedals and trust that the car sees the other car in front, the traffic around you, and is going to brake," Bubbers says.

"Every bone in your body is telling you to press the brake pedal when you see a car slowing down in front of you but you have to relax, not do it," Bubbers admits.

The trust relationship drivers have with their autonomous cars concerns Jason Millar, who specializes in the ethics and governance of robotics and artificial intelligence.

"The many different ways that manufacturers can design these automotive features raises really important and interesting questions about how we go about managing the trust relationship that is struck between the driver and technology," Millar tells Finnerty.

Training will become essential as we transition to driverless cars, says Jason Millar who specializes in the ethics and governance of robotics and artificial intelligence. (Beck Diefenbach/Reuters)

Millar feels the way a car is designed, and the training demanded of its user, need more consideration. He points to the ongoing simulator training pilots undergo, introduced after automated cockpits led to complacency.

"It's not just enough to put sensors in a car. I think there's a lot more that goes into planning these types of systems in actual driving conditions that maybe this incident gives us an opportunity to reflect on."

Millar feels drivers need to know what they are getting into and ask themselves, "Do I really understand how the sensors detect vehicles?"

Bubbers says he was surprised by how quickly he became comfortable with the autopilot feature, but notes it wasn't perfect.

"Tesla admits it's a Beta… sometimes it's not clear what the car is going to do and what the driver is going to do," he tells Finnerty.

Bubbers recognizes autonomous cars are here to stay, but feels legislation needs to catch up with the speed of technological advancement in this field.

"This crash is really important so we can have this conversation about whether they should be on the road or not."

Listen to the full conversation at the top of this website.

This segment was produced by The Current's Julian Uzielli, Ines Colabrese and Pacinthe Mattar.