Why guess someone's mood when an AI can tell you?

AIs now feel empathy. What could go wrong?
So-called "empathy A.I.s" promise to take the guesswork out of reading people's emotions. (Pixabay)

Imagine if an app could tell you, with certainty, whether someone you're with is angry at you, even if they're trying to hide it. Or sad, fatigued, or in any other emotional state. Is that something you'd want to use?

So-called "empathy A.I.s" are the latest thing in customer service, as well as a range of other applications.

Recently, the U.S. insurance company MetLife started trying out a digital assistant called "Cogito," which listens in on phone calls between clients and customer service representatives.

Using an algorithm, it signals the rep when it detects that the client is becoming emotional, and it offers advice and tips in response. Similarly, it looks for signs of fatigue or agitation in the customer service agent and makes suggestions to help. Sort of like an empathy coach.
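To make the idea concrete, here is a minimal, purely illustrative sketch of how a coaching assistant of this kind might map features of a call to suggestions. The feature names and thresholds below are assumptions for illustration only; they are not Cogito's actual method or MetLife's configuration.

```python
# Hypothetical sketch of an "empathy coach" for a call centre.
# All feature names and thresholds are illustrative assumptions,
# not the real Cogito algorithm.

def coach_hints(features):
    """Return coaching hints based on simple call features.

    features: dict with assumed keys --
      'pitch_variance'  : variability of the caller's pitch (0 to 1)
      'speech_rate_wpm' : caller's words per minute
      'agent_pause_sec' : average pause before the agent responds
    """
    hints = []
    # High pitch variance plus fast speech is treated here as a crude
    # proxy for a distressed caller (an assumption, not a real model).
    if (features.get('pitch_variance', 0) > 0.6
            and features.get('speech_rate_wpm', 0) > 170):
        hints.append('Caller may be upset: acknowledge their frustration.')
    # Long pauses from the agent are treated as a fatigue signal.
    if features.get('agent_pause_sec', 0) > 2.5:
        hints.append('You sound fatigued: consider slowing down.')
    return hints

# Example: a fast, high-variance caller triggers the first hint.
print(coach_hints({'pitch_variance': 0.8,
                   'speech_rate_wpm': 190,
                   'agent_pause_sec': 1.0}))
```

Even this toy version shows the concern Sedenberg raises below: a hard threshold turns a fuzzy human judgment into a confident-looking verdict.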

Sounds great, right? Or, maybe not.

Elaine Sedenberg is the co-director of the Center for Technology, Society & Policy at the University of California, Berkeley, where she is also a PhD candidate.

She says that while the idea is good in principle, humans are built to empathize with one another in a fuzzy, intuitive way, and adding too much certainty to those intuitions could be fraught with risk.

It could make people believe that someone is frustrated with them, even if that person is only having a frustrating day, she suggested.

Sedenberg offers a couple of examples, such as using the technology on a Tinder date, to see whether a date is going well from the other person's perspective.

"We can think that sounds like a Tinder date from hell, but we can also think of how, you know, it would be very tempting to use that technology," she said.

Perhaps scarier is if empathy A.I. is deployed at high-level international talks, she added.

"What if this type of technology was used in a nuclear arms deal, and it was used by different nation states trying to negotiate something very serious, and trying to decide if the other side is bluffing or is lying? That makes me really uncomfortable."


She noted it's important to remember that the infrastructure for this technology is already in place: phones have microphones and cameras capable of recording video.

"We also have large repositories of these data online through our social media accounts or different photo streams that we've posted in the past. So we already have the ability to collect a lot of this information in real time, through surveillance or through our own personal devices."

It's also possible, and likely, that the A.I. could go back and mine data from 10 years ago, when we were first using Facebook or Twitter and posting these things online, she added.

However, the idea of digital empathetic assistance isn't all bad, she said.

It has promising applications for people with autism or another disability that makes it particularly difficult to read emotions. It could also be used to monitor, say, long-distance truck drivers to measure how fatigued they may be.

Ultimately it will come down to how comfortable people are with outsourcing something that used to be considered the exclusive domain of human intuition, she said.

"I think it is very important that there is a conversation about the technology."