Could robots be better at listening to abused children?

AI may be able to step in where police and social workers lose their sense of neutrality.
Robots like these are being used to interview children suspected of suffering abuse. (Zachary Henkel)

Would a child open up to a robot interviewer about being abused?

That's the question Mississippi State University computer scientists Cindy Bethel and Zachary Henkel have been exploring.

Children's accounts are often vital in cases of abuse, but even trained police interviewers can find it difficult to remain neutral when talking to kids.

Robots, then, just might be the key to reducing bias in child abuse cases.

"It's common for children who've experienced maltreatment and abuse to talk to their toys or pets, but they find it very difficult to talk to adults," Cindy says. "I wasn't sure if children talking to robots would work or not but I thought it would be worth investigating."

What might the advantages of a robot interviewer be?

"In comparison to human interviewers, the robot is able to be very consistent and controlled with its reactions," Zack explains. "The second advantage is that kids may not perceive the robot as being so socially judgmental or opinionated like they would another human being."

The robot also has a variety of sensors and logging capabilities that can provide valuable data.

As for the challenges?

"We need to be aware of whether the child realizes that the information they are providing is going to be used in some way or not," Cindy says.

"It's also possible that with some robots children may judge them to be more like a toy and so they may engage in a playful way which could lead to inaccurate accounts of what happened," Zack says.

Where do things stand now?

"So far we haven't encountered any cases where children are less willing to open up to a robot in comparison to a human, but we definitely need to explore more topics and more groups of kids before we can be definitive on that," Zack explains.

"And then we're looking at… what type of information gathering we need to do with the robots to make it admissible in legal cases," Cindy adds. "We're excited about exploring this."

