Spark

It's getting harder to tell the difference between Google's AI and a real human... and that's a problem

Why AI should announce its identity. (Pixabay)

Should AI be required to announce that it's not human? Until recently, that would have sounded like science fiction. But as artificial intelligence gets more sophisticated in skills like speech recognition, some argue we need to make sure we're not deceived by it.

Google recently announced Duplex in a demo at its I/O developer conference. It's technology that enables more natural phone conversations between humans and computers.

Google is developing this ability by training a neural net on datasets of anonymized conversations. The idea is that you could tell Google Assistant that you want to make a restaurant reservation, and it could carry that out for you.

Crucially, the conversational datasets covered very constrained topics, such as booking a restaurant reservation or scheduling a hair appointment. By restricting the topics, Google was able to train a system that responds naturally and quickly. Duplex cannot hold an open-ended, general conversation.
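To get a rough sense of why narrowing the topic makes the problem tractable, here is a deliberately simplified, hypothetical sketch. It is not how Duplex is actually built (Google describes a trained neural net, not hand-written rules), but it shows how little information a booking call really needs: once the domain is fixed, the agent only has to fill a handful of "slots" before it can confirm.

```python
# A hypothetical, toy illustration (not Google's Duplex system): in a
# constrained domain like restaurant booking, the agent only needs to
# fill a few slots, so the dialogue can be driven by very simple logic.

from dataclasses import dataclass
from typing import Optional


@dataclass
class BookingState:
    """Slots the agent must fill to complete a restaurant reservation."""
    party_size: Optional[int] = None
    date: Optional[str] = None
    time: Optional[str] = None

    def missing_slot(self) -> Optional[str]:
        # Return the first slot that still has no value, if any.
        for name in ("party_size", "date", "time"):
            if getattr(self, name) is None:
                return name
        return None


QUESTIONS = {
    "party_size": "How many people is the reservation for?",
    "date": "What date would you like?",
    "time": "What time works for you?",
}


def next_utterance(state: BookingState) -> str:
    """Ask about the first unfilled slot, or confirm once everything is known."""
    slot = state.missing_slot()
    if slot is not None:
        return QUESTIONS[slot]
    return (f"Great, booking a table for {state.party_size} "
            f"on {state.date} at {state.time}.")


if __name__ == "__main__":
    # Simulated exchange: the caller's answers fill the slots one at a time.
    state = BookingState()
    print(next_utterance(state))   # asks for party size
    state.party_size = 4
    print(next_utterance(state))   # asks for date
    state.date = "Friday"
    print(next_utterance(state))   # asks for time
    state.time = "7 p.m."
    print(next_utterance(state))   # confirms the reservation
```

The point of the sketch is the constraint itself: with only three things to find out, sounding natural is an engineering problem, whereas an open-ended conversation is not.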

"I was very impressed by it," said Toby Walsh, professor of artificial intelligence at the University of New South Wales in Sydney, Australia, and author of a new book on AI.

Author and AI professor Toby Walsh

Although he was impressed on a technical level, Walsh has concerns about a technology that can sound human (right down to adding "ums" to its speech).

"There are lots of settings you can think of where this could cause quite significant problems," he explained. "What if it was used by political parties to mass call voters and persuade them to vote in a particular way...What if we're calling children? What if we're calling elderly people who may not have the same mental faculties to be able to distinguish computers from humans?"

Walsh points out that although Duplex can currently converse in only a few very limited domains, there is no technical reason it couldn't be trained on many other types of conversations. He also notes that widespread use of this technology could put people like receptionists out of work.

For its part, Google has said Duplex is still very much in development, and that what it showed earlier this month was a demo. It also issued a statement to The Verge clarifying that Duplex would have "disclosure built in."

There are good ethical and other reasons why it is dangerous to build systems in our likeness. - Toby Walsh

That said, if we're heading into a future where technologies can present themselves as human, we should start thinking now about some of the practical and ethical concerns.

Walsh argued that realistic technologies like Duplex should identify themselves as artificial, so that people are not misled. 

"Computers should look like computers. Robots should look like robots," he said. "There are good technical reasons...There are good ethical and other reasons why it is dangerous to build systems in our likeness, in that we will assume too much of them."
