Monkeypox conspiracy theories spread rapidly on TikTok, says U of A researcher
Countering misinformation quickly is key, say experts
New research from the University of Alberta tracked conspiracy theories about monkeypox on TikTok that made false claims about everything from vaccines to Bill Gates.
A study published Tuesday in the health journal JAMA Network identified misinformation trends in videos posted to the social media platform in May 2022, a few weeks after media began reporting on outbreaks of monkeypox around the world.
Timothy Caulfield, a researcher with the U of A's Health Law Institute, said the themes that emerged in the videos were like "COVID 2.0."
"You just can't believe how consistent it is. Often monkeypox is the entry point and then they start ranting about vaccines and then they start ranting about bio labs in Ukraine," Caulfield said in an interview Tuesday.
"You believe one conspiracy theory, you're more likely to believe all the conspiracy theories."
Caulfield and his co-author, Marco Zenone, collected 864 videos about monkeypox on TikTok and found that 153 of them espoused a conspiracy theory.
Caulfield said that within an average of 30 hours of being posted, the videos had collectively been viewed 1.5 million times, liked 75,000 times and shared 14,000 times.
"Misinformation spreads incredibly quickly. It emerges incredibly quickly and does damage very, very quickly," he said.
The study identifies 11 different themes and types of misinformation, the most common being the false assertion that monkeypox is the next planned pandemic.
A number of videos spread misinformation about vaccines, while others suggested that Bill Gates speaking about the need to prepare for future pandemics is a sign of his involvement.
Other conspiracy theories tied monkeypox to news of monkeys escaping following a car crash, while some suggested the virus is a sign that the rapture is coming.
Caulfield said that as he reviewed videos, he was pleased to see videos debunking the claims appearing alongside the false information.
"One of our goals with this study is to highlight how you can use social media platforms, TikTok especially, to sort of [identify] the emerging themes," he said.
"And then you create good, engaging content to push back. And I think the good news is we did see that kind of content."
Response needs to be fast
The study results reflect a wider trend, said Matthew Johnson, education director with non-profit MediaSmarts.
"Any time there is a significant news event of any kind, you see an uptick in different kinds of disinformation and misinformation," he said Tuesday.
Misinformation and disinformation are both types of false information, but disinformation is distinct because it is deliberately spread with the intent to mislead people.
Johnson said there are a number of different ways to battle misinformation and disinformation, starting with platforms like TikTok cracking down by slowing the spread or limiting amplification of misleading videos.
He said there's also an opportunity for researchers, civil society organizations and the media to do "prebunking" when it becomes clear that false information is beginning to spread.
Johnson said prebunking means countering the misleading content before someone has even heard about it, and letting them know they may encounter false information on the subject.
"That's not to say that debunking isn't effective or that it isn't worth doing," he said. "But we also know that the more times someone has seen a false claim, the harder it is to get that false claim out of their heads."
Finally, Johnson said that everyone else can play a role by verifying information before sharing it on social media, and by correcting or speaking up when they notice something false.
"We've also found in research that even just asking a question, even just saying 'Are you sure that's true?', can be an effective way of limiting the spread of disinformation," he said.