Marketplace flagged over 800 social media posts with COVID-19 misinformation. Only a fraction were removed
Claims that vaccines cause sterility, fake COVID treatments among posts Facebook hadn't removed as of March 29
The world's social media giants promised to crack down on harmful COVID-19 misinformation that has proliferated since the pandemic began, but a CBC Marketplace investigation found that when problematic posts were flagged, most weren't labelled or removed.
Between Feb. 3 and Feb. 16, Marketplace producers combed through Facebook, Instagram, YouTube and Twitter, using each platform's built-in reporting tools to flag more than 800 posts that breached each company's policies, which cover, among other things, posting misinformation.
The result: 12 per cent of the posts were labelled with warnings or taken down entirely. That number jumped to 53 per cent only after Marketplace journalists identified themselves and shared the findings directly with the companies.
"Facebook, Twitter, YouTube and Instagram have become the primary superspreaders of misinformation in our world," said Imran Ahmed, founder of the Centre for Countering Digital Hate (CCDH), a non-profit based in Washington, D.C., which Marketplace collaborated with on this project. "That is a shocking failure to act on misinformation that was handed to them on a silver platter."
Of the 832 posts Marketplace flagged, 391 came from Facebook, 166 from Instagram, 173 from Twitter and 102 from YouTube. The posts had a combined 1.5 million likes and 120,000 comments and covered a range of COVID-19-related topics, but generally circled back to a few central themes: vaccines are dangerous, COVID-19 isn't and don't trust authorities.
Partly fuelled by social media, partly fuelled by the COVID-19 conspiracy movement's effective persuasion tactics, misinformation has contributed to anti-lockdown sentiment, COVID-19 denial and vaccine hesitancy, said Ahmed.
Ahmed says companies such as Facebook are motivated to keep users sharing more content, not less. The more users scroll and consume, the more these companies make from advertising, their main source of revenue, he said.
Marketplace wanted to see whether the social media giants had improved since a 2020 CCDH study, which found the companies acted on only five per cent of the misinformation it reported. The CCDH cross-referenced and analyzed CBC's data to ensure the problem posts did breach company policies for Facebook, Instagram, YouTube and Twitter.
Facebook, which owns Instagram, took action on about 18 per cent of the posts flagged on both platforms. That number jumped to about 67 per cent after Marketplace shared its findings.
One of the posts that is still up on Facebook weeks later shows a picture of Bill Gates with the headline: "New vaccine causes sterility in 97% of women!" There is no evidence that links coronavirus vaccines to sterility.
Another post shows a homeopathic product, which purportedly "enhanced immunity" against COVID-19 and promised "reduced frequency and shorter duration of symptoms." It sells for $49.99 US.
There are no homeopathic remedies that can cure or alleviate COVID-19 symptoms.
"Completely ridiculous and a little bit infuriating," Timothy Caulfield, a health law and policy expert at the University of Alberta, said after he was shown the post. "Homeopathic is an easy one because it's completely scientifically implausible. That one is so clearly wrong and harmful it should be taken down immediately."
Caulfield says self-reporting tools on social media must lead to action, otherwise people will stop using them, but he understands the difficulty of monitoring platforms that have billions of users.
"The numbers of messages that have to be evaluated are just huge so I think that is one of the great challenges of social media: how can you meaningfully monitor all of these posts, but we know we need to," said Caulfield. "The challenge is there but the harm is real."
Over the course of Marketplace's test, Facebook did take down a number of prominent accounts on its platforms, including Robert F. Kennedy Jr.'s Instagram account, which had close to a million followers — the result of a new policy in February that outright prohibited the posting of any anti-vaccination or COVID-19 misinformation. RFK Jr.'s Facebook account, and the Facebook and Instagram accounts of his group, Children's Health Defense — with a combined following of close to 700,000 — are still up.
The company disputed that some of the posts Marketplace flagged violated its protocols, and said in an emailed statement that it had "removed millions of pieces of content on Facebook and Instagram that violate our COVID-19 and vaccine misinformation policies — including two million since February alone."
YouTube, Twitter performed worst
Of the four platforms Marketplace tested, Twitter and YouTube took the least action.
Twitter initially left up all but two of the 173 posts Marketplace reported — including one by a prominent anti-vaccination leader that called the COVID-19 vaccine a "military-grade, deadly bio-weapon." The post yielded more than 2,100 likes and 1,400 retweets.
While Twitter has since removed 18 per cent of the posts Marketplace reported, the company would not say why it initially left up the majority of flagged posts, saying it doesn't "directly comment on third-party studies." It pointed to its updated policies, which include a five-strike system that can lead to account deletion.
YouTube didn't take down any of the flagged videos until Marketplace shared its findings. After that, it took down 34 per cent of the reported videos.
But many still remain — including one from a known conspiracist telling his audience that people are sending him information "telling me causes of [COVID] death have been altered." He said he is also receiving information about "hospitals that are completely dead, nothing happening in there," referencing a viral trend early in the pandemic in which people recorded videos of empty hospitals to try to back up their claims that COVID-19 wasn't real.
The video has over 700,000 views.
YouTube said in a statement that only some of the videos Marketplace reported violated its policies, and said that since February 2020, it had "removed more than 800,000 videos for violations of our COVID-19 misinformation policies."
Ahmed says CBC's results suggest YouTube, Twitter and Facebook may not be paying as close attention to misinformation until news organizations or legislators put them under the microscope.
"What's really great about this study is that this tells us what they're doing when they think no one is watching."
With files from Jade Prevost-Manuel and Dexter McMillan