Websites like Twitch need to address suicide prevention, Montreal researcher says

On the heels of Alexandre Taillefer's appearance on Tout Le Monde En Parle, where he spoke publicly about his 14-year-old son's death in December, a Montreal researcher says websites such as Twitch need to be more proactive about suicide prevention.

Only distress signal Alexandre Taillefer's son sent before suicide was on gaming site

Montreal businessman Alexandre Taillefer, owner of TéoTaxi, is speaking out to raise awareness about the need for suicide prevention online. (Radio-Canada)

When Montreal businessman Alexandre Taillefer opened up about his 14-year-old son Thomas's suicide on the Radio-Canada program Tout Le Monde En Parle, he revealed the only sign that his son was silently struggling came in the form of a message on the online gaming community site Twitch.

The revelation raised many questions about suicide prevention online. It also made researcher Carl-Maria Mörch feel the need to hurry up and finish his doctoral thesis.

A clinical psychologist from France who is doing his PhD at the University of Quebec in Montreal on the topic of using big data to detect suicidal risk online, Mörch said there needs to be more pressure on companies to address the issue. 

"We need more incentives to detect and maybe prevent suicide much better on the internet."

Alexandre Taillefer has begun to speak about the death of his 14-year-old son, Thomas, who died by suicide in December. Shari Okeke has been looking into Twitch and suicide prevention efforts online.

What is Twitch?

Owned by Amazon, Twitch is a live-streaming video platform and an online community for gamers that many people in Quebec hadn't heard of until Taillefer mentioned it on the French-language Radio-Canada talk show. 

The website allows gamers to broadcast themselves playing a video game while their followers watch.

"Broadcasters" have their own channel, similar to YouTube channels. While playing a game, they wear a headset with a small microphone they use to communicate with anyone watching. Those watching can communicate via chat messages on the screen.

The popular Twitch online gaming community site is owned by digital giant Amazon.

Twitch boasts that each month more than 100 million people use its site to watch and discuss video games and says it has 1.7 million "broadcasters."

The Montreal gaming and geek culture blog Girls on Games broadcasts its podcast on Twitch and says the online community is a special place.

"You can follow the people you like and you get notifications when they're online … a lot of people watch to get better at their games or you watch to discover new games," said Catherine Smith-Desbiens, co-editor in chief at

Tech support?

Taillefer, a prominent Montreal businessman and owner of TéoTaxi, said that while no one in his family saw signs his son Thomas was depressed, they later learned the 14-year-old had sent messages on Twitch that clearly used the word "suicide" up to six months before he took his own life.

Taillefer pointed out that Amazon, which owns Twitch, can detect that you're looking to buy red shoes and tailor what you see accordingly, yet has done nothing to address suicide risk on the Twitch platform.

A spokesperson for Twitch, who goes only by the name Chase, refused to be interviewed and provided short responses via email to CBC Montreal's questions.

He said Twitch has a policy that prohibits any activity that may endanger life or lead to physical harm. Chase also pointed out that no message was flagged or reported to Twitch.

Taillefer has also said the person his son reached out to did not flag any message.

'Create opportunities' for help

Mörch says Twitch and other online companies need to be more proactive when it comes to suicide prevention.

Carl-Maria Mörch is doing his PhD at UQAM on the topic of using big data to detect suicidal risk online. (Submitted by Carl-Maria Mörch)

"It doesn't mean these companies are responsible for what the people have said on their website, but they should absolutely create opportunities for these people to be helped," he said.

There needs to be better coordination with health care professionals, who could even help train moderators who watch online conversations, he said.

"Facebook and Twitter have implemented ways of reporting suicide risk in a much more transparent way so they've been more public and open about it," Mörch said.

As for creating an algorithm that could automatically detect messages about suicide, Mörch says it is feasible but there are challenges.

The biggest one he sees is the risk of what else could happen with data used to determine someone is having suicidal thoughts.

"Knowing that you have a suicidal risk could maybe have an impact on your credit score, your ability to own a house, borrow money… it could be used against us," he said.

Still, online companies should do more: create partnerships with researchers and create more awareness and education about suicide prevention online, Mörch said.

Shari Okeke is writer/broadcaster for Daybreak on CBC Radio, and creator of Mic Drop, an award-winning CBC original podcast.