How a Montreal-made online tool helps sexual harassment victims navigate the legal system

Figuring out if any laws were broken is one of the first hurdles that victims of sexual harassment face. Montrealers Ritika Dutt and Amir Moravej have developed a new online tool to help.

Startup searched for area of legal system people needed help with — then the Weinstein scandal broke

The Botler AI team, Amir Moravej and Ritika Dutt, with their strategic adviser, Montreal AI pioneer Yoshua Bengio. (Eva Blue)

Understanding the legalities around sexual harassment and assault can be one of the first hurdles that victims face when deciding whether to pursue an aggressor. 

Montrealers Ritika Dutt and Amir Moravej have developed a new online tool to help.

They launched Botler AI, a robot that scans thousands of court documents on behalf of each user, then emails the user legal precedents related to their situation.

"A lot of people struggle to know if their rights have been violated," Dutt told CBC's Daybreak recently.

"They don't know if they're justified in feeling that way."

Dutt and Moravej are hoping their tool will help more victims come forward.

Using the app and interacting with a robot, people "don't have to be afraid of being judged," Moravej said.

"Basically, a tool to help navigate the legal system in an easier way."

The bot doesn't take any identifiers like name, age or phone number, and all the data it collects from a user is encrypted.

The Weinstein effect

When Dutt and Moravej started building Botler AI, their goal was to help people navigate legal situations, but they weren't sure which issue to address.

Then the Weinstein scandal broke this fall.

Media mogul Harvey Weinstein was accused of sexual misconduct by at least 75 women. Those women alleged Weinstein had been sexually abusive, describing a range of behaviour from making inappropriate comments to rape. 

Not all of those women went to police.

The scandal and resulting #MeToo movement spurred the Botler AI team to focus on developing their tool for sexual harassment cases.

They designed the friendly bot to engage with users in an informal way that starts with support and emojis.

"Sexual harassment is inexcusable, and I'm really sorry if you've had to deal with it. My aim is to empower you by teaching you what your rights are," the bot says in a chat message.

The friendly bot engages with the user in a chat interface.

As the chat progresses, the bot asks the user to respond yes or no to its questions.

Leveraging criminal court documents

The bot has access to 300,000 criminal court documents from the United States and Canada.

Botler AI's co-founders are Amir Moravej, 34, and Ritika Dutt, 26. (Meng Jia)

With that information, it can determine whether a law was violated, and which one, depending on where the events took place.

It also asks for written details for an incident report, which is analyzed and emailed back to the user along with the sections of the Criminal Code related to what happened.

Dutt said the incident report gives the user something they can take to the authorities.

She hopes this extra help will empower alleged victims who are mulling whether to pursue legal action against their aggressor.

Canadian surveys put the reporting of sexual assault at about five per cent of cases overall — making it the violent crime least likely to be reported to police.

The two developers stress, however, that Botler AI doesn't offer legal advice. That's something only a lawyer can do.

The team is now exploring other areas of the law where AI could help people navigate the legal system.

With files from CBC's Daybreak
