Why some experts say it's time for Twitter and Facebook to ban anti-vaccination posts
Social media giants say they direct users to sound information, but won't remove misinformation
As measles cases continue to rise in Canada and the U.S., experts are calling on social media platforms to ban anti-vaccination posts, saying the risks to public health created by misinformation outweigh the right to free speech.
Industry giants like Facebook and Twitter recently announced measures to try to steer users toward scientifically sound information about vaccines, but both told CBC News they won't go so far as to block and remove anti-vaccination material.
"I think this is one of the biggest questions that we're facing right now in this information age: Where is the line between free speech and protecting the public?" said Jonathan Jarry, a biological scientist with McGill University's Office for Science and Society, a centre dedicated to helping the public separate fact from fiction on a variety of science and health topics.
"People are being misled by highly damaging misinformation and I think that in that case, maybe the public good outweighs the right for people to scream 'Fire!' in a theatre where there is no fire."
'No genuine controversy'
"Vaccine hesitancy" is one of the top 10 threats to global health, according to the World Health Organization. Misinformation claiming that vaccines are unsafe or cause "vaccine injuries" such as autism (a theory put forward in a research paper more than 20 years ago that has since been debunked and retracted) frightens some parents and fuels that hesitancy to have their children immunized.
On Tuesday, the head of Gavi, a global alliance committed to increasing immunization, called on social media companies to take down posts that contain false information about vaccines, saying the spread of such content "kills people."
"We have to think about it as a disease," Gavi CEO Dr. Seth Berkley said in Geneva. "This spreads at the speed of light, literally."
Both the Public Health Agency of Canada and the U.S. Centers for Disease Control and Prevention have emphasized the importance of combating that misinformation as a way to stop the current rise of measles, a highly contagious, vaccine-preventable disease that was declared eliminated in Canada in 1998 and in the U.S. in 2000.
Although there can be side effects from vaccines — most commonly a sore arm or low fever — they are temporary and harmless. The risk of a serious reaction, including anaphylaxis, is less than one in a million, according to the Public Health Agency of Canada, and can be managed by the health-care provider administering the vaccine.
The fact that misinformation makes some parents afraid to vaccinate their children constitutes a public health threat serious enough to outweigh freedom of speech on Facebook, Twitter and other social media platforms, Jarry said.
"There are issues where debate is healthy, where you have people who in good faith ... want to debate the evidence because it's not clear where the consensus lies, and that's perfectly fine — this is how science moves forward," he said.
"But in the case of vaccination ... there is no genuine controversy there. The science is robust."
Effort 'belated,' expert says
Twitter Canada recently launched an initiative called #KnowTheFacts, promoting it as a way to combat vaccine misinformation "by surfacing evidence-based resources" from the Public Health Agency of Canada.
When a user searches Twitter for information on vaccination or immunization, a notification pops up on the screen titled, "Know the facts," with a message below saying, "Information and resources on vaccines and immunizations are available from the Public Health Agency of Canada." Users can then click on a blue button that says "Reach out," which links to an information page about vaccines and immunization on the Government of Canada website.
In an email to CBC News, a spokesperson for the Public Health Agency of Canada confirmed that it provided input on messaging for the notification service, but did not provide any funding for the development or deployment of the service.
Twitter's effort is "a bit belated to the extent that anti-vaccination and vaccine hesitancy have been spreading and becoming increasingly popular online," said Fuyuki Kurasawa, director of the Global Digital Citizenship Lab at York University in Toronto.
Other social media platforms, including Facebook, YouTube and Pinterest, had already announced measures to counter vaccine misinformation. Pinterest is the only platform that has gone as far as blocking and banning anti-vaccination content. As a result of a change implemented last year, if someone searches a term such as "anti-vaccine," a message comes up saying there are no "Pins" (or content) for that search, a company spokesperson said.
Kurasawa applauded the fact that Twitter is collaborating with public health authorities, but said its approach "doesn't necessarily go far enough."
Users can easily ignore the #KnowTheFacts notification. And tweets and hashtags from those sending out anti-vaccination messages still appear as often and as prominently as they did before.
"I think that Twitter is caught in a kind of bind," he said. "It wants to retain and increase its user base, so it doesn't want to actually ban people or users who are spreading these kinds of anti-vaxxing disinformation messages.
"[But] it really should be, I think, banning people who repeatedly spread these kinds of false messages and this kind of false information online."
Not 'arbiters of truth,' Twitter says
Both Twitter and Facebook told CBC News they have no plans to block anti-vaccination posts, nor do they intend to ban known individuals or groups that spread anti-vaccination messages on their platforms.
"We don't feel like we are the arbiters of truth," said Michele Austin, head of government and public policy for Twitter Canada. "We're just trying to help users locate reliable public information on vaccines, should they choose to look for it."
Twitter users like to engage in public conversation in real time, Austin said. She suggested those conversations, rather than censorship of content, may be "what's really important in changing somebody's mind [about vaccination]."
Back in March, Facebook announced it would tackle vaccine misinformation by "reducing its distribution and providing people with authoritative information on the topic."
Part of the strategy, it said, was to "reduce the ranking of groups and pages that spread misinformation about vaccinations in News Feed and Search" — meaning they shouldn't pop up as often or as prominently.
But more than two months after that announcement, a Facebook search for "vaccination" conducted by CBC News yielded results topped by a group called "STOP VACCINATION!!! STOP INJURING AND KILLING KIDS!!!"
When CBC News asked Facebook for comment, the company said in an email that its approach is to counter vaccine misinformation with accurate information provided by reputable health authorities, including the World Health Organization and the U.S. Centers for Disease Control and Prevention, rather than removing posts.
It also said that holding opinions against vaccines would not warrant Facebook taking action against those users.
"Facebook empowers people to decide for themselves what to read, trust, and share by informing them with more context, promoting news literacy, and collaborating with academics and other organizations to facilitate conversations around challenging and sensitive issues," a spokesperson said in a statement.
But Jonathan Jarry, the scientist from McGill University, noted that Facebook has banned high-profile figures, such as conspiracy theorist Alex Jones, citing violations of the company's policies on hate speech.
"Maybe we need to do the same thing with ... the biggest figures in the anti-vaccination movement, simply because the misinformation that they are pushing is having very, very concrete, dangerous and lethal consequences for the public," Jarry said.
With files from Reuters