Twitter's slap-down of Liberal video reveals growing role of social media giants in election

Social media companies are playing a more active role in this election than they have in the past, in the name of fighting misinformation. But where should the line be drawn between misinformation and free speech, and how much should companies have to reveal about how those decisions are made?

Some platforms are cracking down on misinformation; on others, it spreads freely alongside the organizing of anti-vaxx protests

Misinformation is flourishing on social media platforms such as Telegram, which is also being used to organize angry protests at Prime Minister Justin Trudeau's election events. (Sean Kilpatrick/The Canadian Press)

They were two words launched into the middle of a Canadian election that exploded online.

A week into the election campaign, Deputy Prime Minister Chrystia Freeland tweeted a video of Conservative Leader Erin O'Toole responding to a question about his views on private, for-profit health care in Canada.

Twitter suddenly slapped a tag on the posts, first in French and then in English, saying they were "manipulated media," apparently because part of O'Toole's answer upholding the principle of universal access had been edited out.

An online furor erupted, spawning criticism and conspiracy theories. The commotion eventually died down but not before the English video was viewed nearly 232,000 times - far more than it likely would have been seen if Twitter had not tagged it.

"If Freeland had posted this doctored video and sent it out into the Twittersphere, a small number of people would have seen it and the conversation would have moved on," said Aengus Bridgman, director of the Canadian Election Misinformation Project, which is monitoring what is happening online during the election.

"The fact that Twitter flagged it as manipulated media meant that, suddenly, the issue and the tweet got an enormous amount of attention and sort of has driven the news cycle."

The incident shines a light on the role of American social media giants in Canada's election - a role that risks being a lot more active than in past campaigns.

Aengus Bridgman, director of the Canadian Election Misinformation Project, says social media companies are trying to navigate a difficult path between controlling misinformation and allowing free speech. (Louis-Marie Philidor/CBC)

Companies such as Facebook and Twitter have come under fire in recent years for not doing enough to stop their platforms from being used to spread misinformation or to manipulate elections and public opinion.

Concern about the role social media companies could play in political campaigns came to the fore in 2018, when it was revealed that British consulting firm Cambridge Analytica used the data of millions of Facebook users to help former U.S. president Donald Trump's successful 2016 election campaign.

Now, faced with the prospect of governments in various countries moving to regulate what happens on their platforms, some of the larger players have become more proactive, removing, labelling or limiting the visibility of some posts in the name of fighting misinformation or election tampering.

While some social media companies are taking steps to combat misinformation, there are others, such as Telegram, where it is spreading rapidly. Telegram, owned by Russian billionaire Pavel Durov, is also being used by those opposed to COVID-19 vaccines, lockdowns and mask mandates to organize loud, angry protests in recent days at Prime Minister Justin Trudeau's campaign stops.

That more active approach, however, raises the question of what role the decisions of corporations based in other countries should play in the middle of a Canadian election when it comes to limiting free speech by removing posts or reducing the number of people who see them.

As part of its updated election integrity policy, Facebook is taking several steps, including beefing up its fact-checking, applying warning labels to posts with false information and blocking fake accounts. Its monitors are also on the lookout for attempts by foreign state actors to influence the course of the election campaign.

Facebook will also be continuing a pilot project it introduced in Canada in February to reduce the amount of political content in the feeds of Canadian users, although it won't reduce the number of paid political ads that they see.


Twitter began acting on posts by politicians even before the election call. In July, it suspended MP Derek Sloan from its platform for 12 hours after he posted a link to a Reuters article about the U.K. deciding against a mass vaccination program for teenagers and urged Canada to do the same. Twitter has also slapped labels on tweets by Ontario MPP Randy Hillier, who has opposed COVID vaccines and lockdowns, and on a manipulated video of NDP Leader Jagmeet Singh posted during the election by a regular Twitter user. It has since been taken down by the user.

Bridgman said Twitter began increasing its enforcement actions several months ago.

"I think what Twitter did is a real shot across the bow that is going to shake up how the campaigns are being run" - NDP MP Charlie Angus

"This is part of an initiative of Twitter that started last year during COVID-19, when they really ramped up their labelling of media content on the platform," Bridgman explained.

"So they did it initially because there was so much misinformation about COVID-19 circulating, and they were getting a lot of flak for that. So they put this in place. Then it became applied to political content, sort of famously through the American election with Donald Trump in particular. And now it's being applied to the Canadian election."

Bridgman said Twitter has a small army of algorithmically assisted human fact-checkers who manually label problematic tweets.

Bridgman said social media companies find themselves trying to steer a difficult course.

"It's hard for social media companies to win the PR role here," Bridgman said. "They're in a tight place, because they want to clean up misinformation on their platforms, but they also don't want to be playing kingmaker. That's not in their interest and it's not a good look."

University of Ottawa professor Michael Geist said the incident highlights the challenges that come with trying to moderate content online.

"The government wants the platforms to be more aggressive in moderating content, including creating liability and incentives for failure to take down content within 24 hours. But this case highlights that many of these cases are very difficult."

NDP candidate Charlie Angus says Twitter labelling the video tweeted by Freeland "manipulated media" is a healthy sign the social media giant is serious about tackling misinformation. (CBC)

New Democratic Party candidate Charlie Angus, who has been part of Canadian and international committees that have studied the role of social media companies in society, said the fact that someone with Freeland's status was given an edited video to tweet out is "very concerning."

"I think what Twitter did is a real shot across the bow that is going to shake up how the campaigns are being run," Angus said, adding the Liberal government is supposed to be fighting disinformation.

"The fact that Twitter was willing to call out someone of the stature of Chrystia Freeland for posting disinformation, I think that's a very healthy sign."

It's also coming at a good time, he said.

"Things are going to heat up a lot, so Twitter stepping in at this point in the campaign, I think, is going to make everyone think they're going to have to be a little bit more careful."

Liberal candidate Nathaniel Erskine-Smith, who was also a key figure in Canadian and international committee hearings into social media companies, said Twitter should have been more transparent about how it makes decisions — not just pointing to a multi-pronged policy the way it did with the video tweeted by Freeland.

Liberal candidate Nathaniel Erskine-Smith says social media companies have to be more transparent about the algorithms they use and how they make decisions. (CBC)

"I assume that it's the isolated editing that they are drawing from there. But again, yes, it removed the reference 'universal access.' But given the nature of the comments in relation to the Saskatchewan MRI policy, I don't think it inaccurately characterized the concern around private pay in a universal system."

Erskine-Smith also questioned the way Twitter applied its policy when it labelled the video tweeted by Freeland.

"I would say they're probably applying that policy in a way that probably is excessive, I think, in the circumstances. But, you know, it's there again, private platforms, but they have a significant influence and not only on our elections, but our public discourse more broadly," he said.

Neither Conservative candidate Bob Zimmer, who served with Angus and Erskine-Smith on the committees that studied the impact of social media companies, nor the Conservative Party responded to several requests from CBC News for an interview.

When it comes to how the actions of social media companies risk affecting the election, opinions vary.

"I think there is no question that social media companies impact the election now, with their policies around moderation and misinformation," said Bridgman. "I think that that ship has sailed, and it's not a question of whether they will or not. It's a question of how much and how they will do it."

Erskine-Smith, however, is convinced that traditional campaign elements like door-knocking, policies and the debates will have more impact than what happens on social media.

"I don't think it will have a great impact in the end, in so far as I don't think the decisions that the private platforms make will have a great impact in the end."

Daniel Bernhard, executive director of Friends of Canadian Broadcasting, was sharply critical of Facebook's track record when it comes to cracking down on misuse of its platform.

"Canada is foolish to depend for the health of our democracy on the good will and the competency of a company like Facebook that has proven over and over and over again that it is both incapable and unwilling to act in an ethical and democratic way."

Bernhard also wants social media companies to have to divulge the algorithms they are using to govern how their platforms operate.

"These algorithms make hugely consequential editorial choices that have major consequences for politics and democracy. And so their operation but also their transparency, should be a matter of regulation — not of good will and voluntary compliance."

Erskine-Smith would like to see new rules to require more transparency from social media companies, pointing out that Canada has a Broadcast Standards Council but no watchdog for social media companies.

"When we see the power and influence that private platforms do wield in our public discourse, bringing a level of transparency to the way decisions are made by those platforms is incredibly important ... not only in relation to specific, discrete policy decisions, like Twitter's decision to apply its own standards, but how the algorithms themselves are promoting or downgrading certain content," he said.

"As algorithms replace editors, and increasingly so, we do need greater algorithmic transparency."

Elizabeth Thompson can be reached at elizabeth.thompson@cbc.ca



Elizabeth Thompson

Senior reporter

Award-winning reporter Elizabeth Thompson covers Parliament Hill. A veteran of the Montreal Gazette, Sun Media and iPolitics, she currently works with the CBC's Ottawa bureau, specializing in investigative reporting and data journalism.