Misinformation on Reddit has become unmanageable, Alberta moderators say
Company behind website says it takes the issue seriously, but moderators say problems persist
Misinformation flooding some of Alberta's biggest online communities has become a major problem during the COVID-19 pandemic, according to three volunteer Reddit moderators in Edmonton.
In separate interviews with CBC News, the moderators said the number of posts pushing misinformation and conspiracy theories about COVID-19 has surged in their communities on Reddit, a huge social media and news aggregator site that hosts thousands of discussion forums or communities known as subreddits for millions of users.
The moderators say the vast majority of these posts come from users who have never participated in their online communities before.
Thousands of Albertans rely on their local Reddit communities, but volunteer moderators say keeping these places free of pandemic-related misinformation and disinformation — misinformation that is intentionally spread — has become increasingly difficult and time-consuming.
Posts questioning the safety of vaccines and masks, linking vaccines with 5G networks, comparing COVID-19 to the flu and promoting unproven treatments like ivermectin have become common, the moderators said.
Harassment, death threats common
"Our moderation queue went from probably one or two of those a week to dozens to hundreds every day, and it is unsustainable — we cannot keep up with that," said a 32-year-old Edmonton man who is a moderator for r/Alberta, a subreddit with more than 138,000 subscribers.
CBC News is not naming the man because he has received personalized threats and harassment from people whose posts he has removed in the past and he fears he could be targeted again.
A fellow r/Alberta moderator, who is an Edmonton student in his 20s, said death threats have become common in the team's inbox and the authors of misinformation posts often urge moderators to kill themselves.
r/Alberta is not the only online community dealing with a deluge of misinformation.
Troy Pavlek, a moderator for r/Edmonton, a subreddit with more than 136,000 subscribers, said moderators handle between 50 and 100 misinformation posts per day.
He said users opposed to masks and vaccines pose as health-care workers and spread messages like, "Have you heard that vaccines can kill you?"
"The acceleration, just like our case counts, has gone exponential," Pavlek said.
Company changes inadequate, moderators say
Following online protests in which moderators of some of Reddit's largest communities temporarily took their subreddits private over the company's handling of COVID-19 misinformation, the company banned one subreddit, quarantined 54 others and added a reporting feature for moderators to flag community interference. (A quarantine means the page won't show up in search results, and when users try to access it directly, they'll be shown a warning message.)
A Reddit spokesperson told CBC News the company takes the issue extremely seriously and has a goal of decreasing the burden placed on moderators.
The spokesperson said the company has changed how it detects users who evade bans by creating new accounts and said moderators can turn to a pool of experienced peers for help when it comes to unexpected traffic surges.
Moderators say the recent changes are inadequate because the flow of misinformation has not slowed down.
"The very spread of these ideas in multiple spaces, multiple times are undoubtedly doing real harms to society, increasing the spread of COVID-19 and almost certainly leading to death," said Rebekah Tromble, director of the Institute for Data, Democracy & Politics at the George Washington University.
Tromble said companies could choose to restructure their platforms to be much smaller, but she didn't think they would do that "because where they achieve profit is through the very large, scalable model." She also talked about the possibility of governments breaking up tech companies through regulation.
She said the European Union has two large pieces of proposed legislation — the Digital Services Act and the Digital Markets Act — that "when they go into effect, are going to have clear spillover effects around the world."
The Edmonton moderators said their teams have been struggling with burnout for months. Some moderators have left their positions and recruiting new ones can be difficult.
Despite being on the receiving end of so much invective, the volunteers said they cannot bring themselves to walk away because that would mean allowing more misinformation to spread in communities they care about.
"If I were to stop doing what I'm doing, then the misinformation just gets worse, and once we've made it a home for misinformation, once we've said this is OK, this is not something we're going to stop, then it sweeps in more and more, and this community, which has been a joy for me in the past, will never be a joy for anyone ever again," Pavlek said.