TikTok says it’s cracking down on dangerous challenges. Will it be enough?

Published 2021-11-18 08:00

TikTokers say more should be done to stop harmful content


You’ve probably noticed challenges on TikTok that encourage kids to do dangerous or harmful things.

TikTokers like 14-year-old Kate Roman from Toronto, Ontario, say the platform needs to do more to get rid of those posts.

“I don’t think what they’re doing is enough,” Kate told CBC Kids News.

This week TikTok tried to address that.

The social media platform hired a company to look into those issues for them.

And on Nov. 17, TikTok announced the findings of that report, and a plan to tackle the problem.

But some people say TikTok isn’t meeting their expectations around safety, and they have doubts about the company’s plans to fix things.

“I don’t think what they’re doing is enough, I think they’re kind of doing the bare minimum.” - TikToker Kate Roman, 14

The story behind the report

A few months ago TikTok hired Praesidio Safeguarding — a consulting agency that researches online environments — to speak to teens, parents and teachers from 10 countries about dangerous challenges and hoaxes.

Why? Alexandra Evans, TikTok's head of safety public policy in Europe, said in an email, “We take our responsibility to support and protect teens seriously — giving them tools to stay in control, mitigating risks they might face, and building age-appropriate experiences — so they can safely make the most of what TikTok has to offer them.”

The independent consulting agency surveyed 10,900 people across the globe to help it understand why those types of posts get the attention they do.

TikTok also had a panel of 12 youth-safety experts review the report once it was finished.

One finding showed that nearly half of the teens interviewed wanted more information to help understand the risks involved with online challenges and hoaxes.

The report also suggested that the majority of teens did not see sharing hoaxes as a problem, and didn’t necessarily recognize them as fake.

What's the deal with hoaxes and challenges?

Kate mentioned the “slap a teacher” challenge and the “devious licks” challenge as two examples among the “countless trends” proving to be harmful.

“Slap a teacher” encouraged people to assault school staff without getting caught. “Devious licks” involved stealing things from school and vandalizing.

As soon as one trend ends, another one starts that encourages “bad behaviour after bad behaviour,” she said.

Alexandra Evans, head of safety public policy for TikTok in Europe, is shown during a webinar on Nov. 15 where she announced the findings of a report into dangerous challenges and hoaxes on the platform. (Image credit: TikTok)

Some of these challenges have also turned out to be hoaxes.

For example, various news outlets have reported that there was little evidence that the “slap a teacher” challenge was actually happening.

Another example of a hoax is the “Momo” challenge, where a creepy sculpture was said to have popped up in the middle of a video, ordering people to do violent and dangerous things.

Reports suggest the “Momo” challenge is more rumour than fact.

How safe do teens feel on the app now?

Kate said that in her experience, videos that some might consider offensive, but that aren’t actually dangerous, were being taken down.

But she said it doesn’t seem like TikTok is doing enough to remove content that could potentially lead to self-harm.

TikTok profiles of Kate Roman, from Toronto, Ontario, and Amélie Timer, from Vancouver, British Columbia. (Image credit: k8roman/amelietpovs/TikTok, graphic design by Philip Street/CBC)

TikToker Amélie Timer, from Vancouver, British Columbia, said she doesn’t think TikTok’s reporting feature — which allows users to flag problematic videos — works very well.

The 16-year-old said she has reported what she thought were inappropriate videos before, only to receive notifications from the app saying the videos would be left alone because it found nothing wrong with them.

“What’s the point of the report feature if it’s not being used properly?” she said.

Can we trust that TikTok will get serious about banning dangerous content?

TikTok said it would roll out new safety features globally over the next few months.

Those changes include new technology to detect a sudden spike in hashtags related to harmful challenges and hoaxes, as well as a warning prompt that pops up when people search for them.

Still, some say the new features might not solve all of the problems on the app. Why?

This is a new feature TikTok announced to flag and inform its audience of dangerous challenges and hoaxes. (Image submitted by TikTok, graphics by Philip Street/CBC)

Kate Tilleczek, an expert in youth and the digital age at Toronto’s York University, said it’s important to think about how much money TikTok makes when somebody clicks on these videos.

“You leave [regulation] in the hands of folks who are making billions of dollars to do the right thing by kids, and I’m always thinking: ‘They’re not going to do that,’” she said.

Tilleczek was speaking from her own experience and expertise. She did not have access to TikTok's report before speaking with CBC Kids News.

Suggestions for TikTok

Tilleczek offered up one solution: Partner with kids to create content so they can address dangerous challenges and hoaxes amongst each other.

“Young people are getting schooled either by themselves or through media literacy,” she said. “I think that could go a long way if it's sort of a peer-to-peer conversation.”

The report that TikTok commissioned also suggested that peer mentoring programs and initiatives led by young people were a good idea.

For Amélie, a warning doesn’t go far enough.

“A lot of the time warnings just kind of fly past your head and you don’t really notice them as much. So if something is especially dangerous, I think it should be taken down,” she said.

Kate said more needs to be done to pull down videos before they get views and attract attention.

She said the only way to control harmful videos is to either have a separate page dedicated to TikTok’s younger audience or ban that kind of content altogether.

“There are a lot of kids on this app. Parents shouldn’t have to worry.”

Have more questions? We'll do our best to look into it for you. Ask for permission from your parent or guardian and email us at cbckidsnews@cbc.ca.

CLARIFICATION: CBC Kids News added a few lines to this story after it published to clarify that the people we interviewed had not seen TikTok's report before they spoke with us. We also wanted to clarify TikTok's goal in publishing the report.