Q&A

Chatbots: Why should we be nice to them?

As artificial intelligence becomes a bigger part of our lives, questions arise about how we interact with it

Personal digital assistants like Apple's Siri are becoming increasingly common. And how we interact with them may say a lot about us, says CBC technology columnist Dan Misener. (iStockphoto)

It's becoming easier than ever to converse with a robot — through products and applications like Facebook Messenger, Apple's Siri, Amazon's Echo (with its assistant, Alexa) and Google Now. Tech companies are betting heavily on conversational bots as the next big thing.

But you might want to be nice to those bots. CBC Radio technology columnist Dan Misener says the way we treat them reveals a lot about who we are as humans.

Why is it important to start thinking about human-robot relations?

Because the chatbots are coming, and we're just starting to see the first wave.

Facebook recently unveiled its plans for chatbots that live inside its Messenger app. These are conversational robots that you message back and forth with to get news, play games and buy things.

CEO Mark Zuckerberg talks about the Messenger app at the 2015 Facebook F8 Developer Conference. At this year's conference, Zuckerberg announced new chatbots that will be part of Messenger. (Eric Risberg/Associated Press)
And Facebook's not alone. Google, Amazon, Apple and Microsoft are all working on chatbots and artificial intelligence — robots you talk to, essentially.

"We're going to be interacting much more with artificial agents in the form of robots, in the form of bots, in the form of A.I.," said Kate Darling. She's a researcher at MIT whose speciality is human-robot interaction.

"It is definitely in our very near-term future."

Outside of North America, we've seen huge adoption of chatbots in places like China, where millions of people use the messaging app WeChat — not just to chat with friends, but to chat with bots that let them shop and pay bills.

As we interact with more and more bots, the question arises — how should we treat them?

Do bots care how we talk to them?

They don't "care" in the same way a human cares, of course. If I'm mean to a bot, I'm not going to hurt its feelings.

But we are starting to see bots that are programmed to respond differently based on how you talk to them. For instance, there's a weather chatbot called Poncho. You can also talk to it about movies. But if you're rude to Poncho, it will call you out and ask for an apology.

And if you continue to be rude to Poncho, it'll just stop talking to you, giving you the silent treatment for 24 hours.

The weather chatbot Poncho will react to how politely you interact with it. (Poncho/Facebook)
The developers say that Poncho's software assigns you a sort of "niceness score," which you don't see, but which can impact the level of service you get.
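Poncho's actual software isn't public, but the behaviour described above — a hidden score that drops when you're rude, triggers a request for an apology, and eventually cuts you off for 24 hours — can be sketched roughly. Everything in this snippet (the class name, the word list, the thresholds) is invented for illustration; it is not Poncho's real code.

```python
import time

# Hypothetical rude-word list and thresholds; Poncho's real rules are not public.
RUDE_WORDS = {"stupid", "dumb", "useless", "shut up"}
SILENCE_SECONDS = 24 * 60 * 60  # the 24-hour silent treatment

class PoliteBot:
    def __init__(self):
        self.niceness = 0        # hidden score the user never sees
        self.silent_until = 0.0  # timestamp until which the bot ignores you

    def reply(self, message, now=None):
        now = time.time() if now is None else now
        if now < self.silent_until:
            return None  # silent treatment: no response at all
        if any(word in message.lower() for word in RUDE_WORDS):
            self.niceness -= 1
            if self.niceness <= -3:
                # repeated rudeness: stop talking for 24 hours
                self.silent_until = now + SILENCE_SECONDS
                return None
            return "That was rude. I'd like an apology."
        self.niceness += 1
        return "Sunny with a high of 22."  # nicer users keep getting service
```

A polite user keeps getting weather reports; a persistently rude one gets called out, then silence until a full day has passed.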

If we're rude to bots, will we become rude to humans?

MIT's Kate Darling says research shows a link between people's tendencies for empathy and the way they're willing to treat a robot. (Flavia Schaub/katedarling.org)
In the field of robot ethics, this is one of the big concerns.

While you won't hurt the bot's feelings, Kate Darling says the way you treat a robot can say a lot about you.

"We've actually done some research that shows that there is a relationship between people's tendencies for empathy and the way that they're willing to treat a robot," she said. 

"You know how it's a red flag if your date is nice to you, but rude to the waiter? Maybe if your date is mean to Siri, you should not go on another date with that person."

It raises a very 21st-century dating criterion: "Must be kind to robots."

What about kids' relationships with robots?

There's not a lot of research into this so far, but it is something that, at least anecdotally, some parents are worried about. 

For instance, the venture capitalist Hunter Walk recently wrote about the Amazon Echo — a "smart speaker" which features a voice-activated assistant called Alexa.

Walk noticed that when his young kids talked to Alexa, they didn't need to say "please" when asking it to do something. As a parent who's trying to teach his kids good manners, he worries about technology that lets you boss it around.

Amazon's Echo speaker features a voice-recognition system called Alexa that is designed to control Pandora, Amazon Music and Prime Music services as well as give information on news, weather and traffic. But it doesn't make you say 'please.' (The Associated Press)
When I asked Kate Darling about this, she told me there's no conclusive research. So we don't know what kinds of expectations, if any, this sets up in kids, or whether technology that tolerates poor manners translates into rude kids.

But she did say we should be cautious about this, especially when it comes to children, because they're still developing behavioural patterns.

What's next for chatbots?

Mass-market chatbots are a relatively new thing in North America. Besides Facebook's big recent chatbot push, other messaging apps like Kik (which is based here in Canada) and Telegram are working on their own chatbot infrastructure.

But if we want a preview of what mainstream bot usage might look like, we need only look to markets like China with WeChat, or Japan with LINE, where chat and messaging services are used for much more than communicating with friends and colleagues.

These apps, and the bots that live within them, are used for paying bills, buying things and playing games.

WeChat, for instance, has more than 200 million credit cards attached to users' accounts, and it has a pretty robust payment system.

These are the success stories that companies focused on North America want to replicate. So don't be surprised if, in the next year or so, businesses start trying to interact with you through a chatbot.

And if they do, think long and hard about how nice you want to be to those chatbots.

About the Author

Dan Misener

CBC Radio technology columnist

Dan Misener is a technology journalist for CBC Radio and CBCNews.ca. Find him on Twitter @misener.
