In the wake of Facebook's Cambridge Analytica scandal, does the tech sector need a code of conduct?

Facebook's Cambridge Analytica scandal, in which the personal data of millions of Americans was allegedly misused by a consulting firm working for Donald Trump's 2016 U.S. presidential campaign, is renewing calls for a tech sector code of ethics.
According to Canadian data expert Christopher Wylie, Cambridge Analytica misused millions of people's Facebook information. (Dado Ruvic/Reuters)


Canadian data analytics expert turned whistleblower Christopher Wylie triggered headlines last week after revealing an alleged leak of private Facebook user data.

The tech sector — unlike other fields with a wide public footprint such as medicine or law — has no overarching code of ethics. But the Cambridge Analytica scandal is renewing calls for ethics education, says an expert.

According to Casey Fiesler, an assistant professor in the Department of Information Science at the University of Colorado Boulder, "there have been a lot of calls for more ethics education, for computer scientists, for data scientists, for more ethical thinking."

She says that in the U.S., formal ethics training for computer science university students varies from program to program.

"There is some mandatory ethics content in order for the program to be accredited," Fiesler tells The Current's guest host Laura Lynch. "But there are no tests, no licensing, for software development as a profession."

Cambridge Analytica chief executive Alexander Nix leaves the company's offices in central London on March 20, 2018. (Dominic Lipinski/PA via Associated Press)

Within the profession, however, Fiesler says several initiatives are underway to create codes of ethics for data science.

"It's unclear how much teeth any of these codes have because of the lack of a licensing program," she cautions. "Even licensing would be challenging because so many technologists are self-taught."

Part of the challenge of creating such a code is that ethics tend to be highly contextual, and it can be difficult to maintain formal guidelines in a constantly evolving industry like technology, says Fiesler.

"Thinking about the ethics of what you're doing is part of that profession, and it's part of what it means to be a good computer scientist," she argues.

Wylie says he's speaking out about his former employer's methods, which he characterized as unethical, because they need to be called out.

The data analytics expert alleged he was tasked with mining tens of millions of Facebook profiles when he worked at data company Cambridge Analytica, which allegedly misused that personal information while working for Donald Trump's 2016 U.S. presidential campaign.

"I think that the algorithms that they have built... using that private data they acquired without consent, is problematic," Wylie told CBC News.

The data-mining firm has denied Wylie's claims.

"This Facebook data was not used by Cambridge Analytica as part of the services it provided to the Donald Trump presidential campaign; personality targeted advertising was not carried out for this client either," Cambridge Analytica said in a statement.


'This is not an accident'

Facebook keeping this secret for so long is something Wired writer Louise Matsakis says "shows a lack of regard for their users' privacy."

She points to reports that Facebook pursued legal action against journalists who wanted to bring the story forward.

The data was collected through an app that required users to log in with Facebook. Matsakis explains that when people downloaded the app, they shared their own information — but the app also gleaned information from each user's friends list. People who never downloaded the app still had their data collected in the process.

"It ended up collecting [data from] about 50 million Americans. They were targeting specifically American voters. They were not aiming to collect anyone else's personal information," says Matsakis.


Matsakis says Facebook knew this could happen long before the alleged details of the Cambridge Analytica scandal came out.

"Facebook's entire business model is collecting as much information on you as it can and then allowing that data to be used by marketers," she tells Lynch.

"That is the whole reason that they become a multi-billion dollar company. This is not an accident. This is exactly how their system was designed."

On Sunday, Facebook said it was conducting a "comprehensive internal and external review" to determine if the personal data that was reported to be misused still existed.

"If these reports are true, it's a serious abuse of our rules," Facebook VP and depute general counsel Paul Grewal told The Current in a statement.

"All parties involved ... certified to us that they destroyed the data in question," Grewal's statement says. "We will take whatever steps are required to see that the data in question is deleted once and for all and take action against all offending parties."

According to Casey Fiesler, Facebook can prevent a data scandal such as the one involving Cambridge Analytica by giving researchers more access to data, not less. (Dado Ruvic/Illustration/File Photo/Reuters)

An ethics review

Beyond the platform-based ethics of Facebook, Fiesler says misrepresenting data in an academic study could have a larger impact than the data scandal itself.

"This is really problematic because it could discourage people from trusting science [or] from participating in science," she tells Lynch.

Fiesler says Facebook's social media data isn't often used or seen in academic research because the company keeps it private.

"There aren't any real formal processes for an ethics review outside of Facebook for researchers who want to get access to Facebook data," she explains.

Other platforms, like Twitter, are largely public, and their data is often used in research. In those cases, however, someone might never know their tweets were used in a study.

Fiesler argues Facebook could prevent the next Cambridge Analytica scandal if they gave researchers more access to data, which would open it up to academic review.

Listen to the full conversation at the top of this page.


With files from CBC News. This segment was produced by The Current's Pacinthe Mattar, Idella Sturino and Saman Malik.
