Facebook says it's tackling trolls and fake news in N.L. election, but steps back from censorship
The social media monolith says it has a team dedicated to election integrity, battling misinformation
The world's largest social media platform has Newfoundland and Labrador under a microscope.
Or so it claims. Facebook says it's taken a keen interest in Canadian politics following the 2016 U.S. election, when false information spread like a virus from newsfeed to newsfeed.
In response to a federal report on cybersecurity in 2017, the company says it has steadily ramped up how it tackles bots, trolls and fake news during Canadian campaigns — including Thursday's election.
"We haven't been made aware of anything of that nature," said Kevin Chan, a public-policy manager with the domestic arm of the multinational company.
"We get occasional reports about people who may be putting things on the platform that may be satirical … [but we haven't seen] co-ordinated, inauthentic behaviour."
Chan took a call with reporters Wednesday to brief media on Facebook's "election integrity" strategy, just days after the New York Times published an op-ed from the company's co-founder that called on governments to regulate and break up the company.
Chan laid out Facebook's goals, which lean heavily on promoting political discussion on its platforms, including Messenger, WhatsApp and Instagram. In a pre-written statement, Chan framed the company's strategy as a way to help people "have a voice in the democratic process."
The company is "committed to making Facebook a force for good in democracy," he said, pointing to a list of "investments" made since 2017 in stopping the spread of misinformation.
Facebook would not disclose the size of those investments, but stressed that it has teams of employees dedicated to carrying out its aims.
What's Facebook doing, then?
Chan reeled off a number of ways the company says it's trying to stop "bad actors" and "false news": it's creating a searchable political advertisement library that'll show the demographics of who saw the ad and how much was spent on it, for instance.
It's also clamping down on the reach of fake news; anything that's identified as being riddled with factual errors will show up in far fewer feeds.
Facebook is also concentrating on booting out what it calls "inauthentic" accounts.
That last point, especially, pushes the company into a grey area. "We have to be careful not to remove real people from the platform," Chan said.
Facebook uses "a variety of different signals that you cannot see on the surface of the account," he explained — such as how often a user interacts with content, scrolls through a feed, or chats with friends — to determine who's authentic and who's only made an account for the purpose of sowing misinformation or fuelling debate.
But even real accounts can cause problems.
CBC News recently reported on the identity of a former Tory staffer who runs political groups in multiple provinces, for instance, effectively spreading meme-based propaganda for conservative politicians.
Chan said he's aware of those accounts, but Facebook has no plans to disclose who's behind them — doing so could, for instance, jeopardize the safety of activists who engage in similar tactics, he said.
The company is also quick to point out it steps back from censorship as much as it can.
"Facebook is not in a position to know the motivations for why people will post things or say things. What we want to do is not censor legitimate speech, but where we have indication [of false information], then we will reduce distribution," Chan said.
That means even intentionally malicious, fake content or parody pages will still have their place on the platform, as the company walks a fine line between safeguarding truth and transparency and letting users say whatever's on their minds.
Chan said Facebook would actually welcome more guidance from governments when it comes to hashing out rules for what people can and can't say online; such regulation would lift the burden of those decisions from private companies.
Those rules could shrink a grey area that some political figures have already found themselves mired in over the course of this campaign.
Last week, Liberal candidate Hasan Hai, who's been the subject of continuous online harassment, called on his supporters to report a parody page featuring his photograph.
Chan said Facebook had indeed received multiple reports that the page violated its community standards, but found that it fell under the umbrella of "satire" and opted not to remove it.
That same hands-off strategy also appears to let political groups like NL Strong flourish, whether or not that flourishing has real-world consequences at the polls.
The bottom line? Facebook wants "to preserve as much space" for discussion as it can without treading into unlawful territory, Chan said.
"If you want to say certain things that may be controversial, that others don't agree with or I don't agree with," he said, "that should be allowed."