How a Montreal-made online tool helps sexual harassment victims navigate the legal system
Startup searched for area of legal system people needed help with — then the Weinstein scandal broke
Understanding the legalities around sexual harassment and assault can be one of the first hurdles that victims face when deciding whether to pursue an aggressor.
Montrealers Ritika Dutt and Amir Moravej have developed a new online tool to help.
They launched Botler AI, a robot that scans thousands of court documents on behalf of each user, then emails the user legal precedents related to their situation.
"A lot of people struggle to know if their rights have been violated," Dutt told CBC's Daybreak recently.
"They don't know if they're justified in feeling that way."
Dutt and Moravej are hoping their tool will help more victims come forward.
Using the app and interacting with a robot, people "don't have to be afraid of being judged," Moravej said.
"Basically, a tool to help navigate the legal system in an easier way."
The bot doesn't take any identifiers like name, age or phone number, and all the data it collects from a user is encrypted.
The Weinstein effect
When Dutt and Moravej started building Botler AI, their goal was to help people navigate legal situations, but they weren't sure which issue to address.
Then the Weinstein scandal broke this fall.
Media mogul Harvey Weinstein was accused of sexual misconduct by at least 75 women. Those women alleged Weinstein had been sexually abusive, describing a range of behaviour from making inappropriate comments to rape.
Not all of those women went to police.
The scandal and resulting #MeToo movement spurred the Botler AI team to focus on developing their tool for sexual harassment cases.
They designed the friendly bot to engage with users in an informal way that starts with support and emojis.
"Sexual harassment is inexcusable, and I'm really sorry if you've had to deal with it. My aim is to empower you by teaching you what your rights are," the bot says in a chat message.
As the chat progresses, the bot asks the user to respond yes or no to its questions.
Leveraging criminal court documents
With that information, the bot can determine whether a law was violated, and which one, depending on where the events took place.
It also asks for written details in an incident report, which is analyzed and emailed back to the user along with the sections of the Criminal Code related to what happened.
Dutt said the incident report gives the user something concrete to take to authorities.
She hopes this extra help will empower alleged victims who are mulling whether to pursue legal action against their aggressor.
Canadian surveys put the reporting of sexual assault at about five per cent of cases overall — making it the violent crime least likely to be reported to police.
The two developers stress, however, that Botler AI doesn't offer legal advice. That's something only a lawyer can do.
The team is now exploring other areas of the law where AI can help people navigate the legal system.
With files from CBC's Daybreak