
Facebook says it's tackling trolls and fake news in N.L. election, but steps back from censorship

The social media giant says it has a team dedicated to battling misinformation in Canadian elections, but that effort collides with the company's reluctance to censor its users.

Facebook is trying to strike a balance between neutrality and stopping the spread of fake news, but still hesitates to censor its users. (Andrew Harnik/Associated Press)

The world's largest social media platform has Newfoundland and Labrador under a microscope.

Or so it claims. Facebook says it's taken a keen interest in Canadian politics following the 2016 U.S. election, when false information spread like a virus from newsfeed to newsfeed. 

In response to a federal report on cybersecurity in 2017, the company says it has steadily ramped up how it tackles bots, trolls and fake news during Canadian campaigns — including Thursday's election.

"We haven't been made aware of anything of that nature," said Kevin Chan, a public-policy manager with the domestic arm of the multinational company.

"We get occasional reports about people who may be putting things on the platform that may be satirical … [but we haven't seen] co-ordinated, inauthentic behaviour."

Chan held a call with reporters Wednesday to brief them on Facebook's "election integrity" strategy, just days after The New York Times published an op-ed from the company's co-founder calling on governments to regulate and break up the company.

Kevin Chan, the head of public policy for Facebook in Canada, spoke to reporters Wednesday about how the platform is clamping down on fake news and political interference. (CBC)

Chan laid out Facebook's goals, which lean heavily on promoting political discussion across its platforms, including Messenger, WhatsApp and Instagram. In a prepared statement, Chan framed the company's strategy as a way to help people "have a voice in the democratic process."

The company is "committed to making Facebook a force for good in democracy," he said, pointing to a list of "investments" made since 2017 in stopping the spread of misinformation.

Facebook would not share the size of those investments, but stressed that it has teams of employees dedicated to carrying out its aims.

What's Facebook doing, then?

Chan reeled off a number of ways the company says it's trying to stop "bad actors" and "false news." For instance, it's creating a searchable political advertisement library that will show the demographics of who saw each ad and how much was spent on it.

It's also clamping down on the reach of fake news; anything that's identified as being riddled with factual errors will show up in far fewer feeds.

Facebook is also concentrating on booting out what it calls "inauthentic" accounts.

That last point, especially, pushes the company into a grey area. "We have to be careful not to remove real people from the platform," Chan said.

Facebook uses "a variety of different signals that you cannot see on the surface of the account," he explained — such as how often a user interacts with content, scrolls through a feed, or chats with friends — to determine who's authentic and who's only made an account for the purpose of sowing misinformation or fuelling debate.

But even real accounts can cause problems.

Balancing act

CBC News, for instance, recently reported on the identity of a former Tory staffer who runs political groups in multiple provinces, effectively spreading meme-based propaganda for conservative politicians.

Chan said he's aware of those accounts, but Facebook has no plans to disclose who's behind them — doing so could, for instance, jeopardize the safety of activists who engage in similar tactics, he said.

The Proud groups and NL Strong often share the same content with the same or similar phrasing, but Facebook allows them and won't disclose the identity of their administrators. (Facebook)

The company is also quick to point out that it steps back from censorship wherever it can.

"Facebook is not in a position to know the motivations for why people will post things or say things. What we want to do is not censor legitimate speech, but where we have indication [of false information], then we will reduce distribution," Chan said.

That means even intentionally malicious fake content or parody pages will still have a place on the platform, as the company walks a thin line between safeguarding truth and transparency and allowing users to say whatever's on their minds.

Government help?

Chan said Facebook would actually welcome more guidance from governments when it comes to hashing out rules for what people can and can't say online, since regulating speech would take the burden of those decisions off private companies.

Those rules could shrink a grey area that some political figures have already found themselves mired in over the course of this campaign.

Last week, Liberal candidate Hasan Hai, who's been the subject of continuous online harassment, called on his supporters to report a parody page featuring his photograph.

A Facebook executive says this page, which appears to impersonate a Liberal candidate, is considered satire and doesn't violate the company's community standards. (CBC)

Chan said Facebook had indeed received multiple reports that the page violated its community standards, but found that it fell under the umbrella of "satire" and opted not to remove it.

That same hands-off strategy also appears to allow political groups like NL Strong to flourish, whether or not that flourishing has real-world consequences at the polls.

The bottom line? Facebook wants "to preserve as much space" for discussion as it can without treading into unlawful territory, Chan said. 

"If you want to say certain things that may be controversial, that others don't agree with or I don't agree with," he said, "that should be allowed."
