Canada's privacy watchdog says Facebook's emotional manipulation study "raises some questions" that the agency is following up on with its international counterparts.
Canada's privacy commissioner says it will contact Facebook for details on the company's mass experiment, in which it manipulated the news feeds of nearly 700,000 users to make them more positive or negative in an attempt to alter those users' moods one way or the other.
"It raises some questions that we are following up on," the commissioner's office told CBC News in an email Thursday. "We will be contacting Facebook to seek further details related to this research and have been in touch with some of our international counterparts about the matter."
Consumer and privacy experts say the experiment sets a dangerous precedent for how far corporations can go in trying to influence consumer behaviour.
In a paper published last month, Facebook data scientist Adam Kramer and two researchers from Cornell University — Jamie E. Guillory and Jeffrey T. Hancock — described how the news feeds of a random sample of Facebook users were manipulated over a week in January 2012 to test whether emotions spread through the social network.
The study concluded that Facebook users were more likely to post negative updates about their lives after the volume of positive information appearing in their feeds had been purposefully reduced by the researchers. The opposite occurred when the volume of negative posts in their feeds was reduced.
European regulators are probing the matter, with the U.K.'s Information Commissioner's Office working with counterparts in France and Ireland (where Facebook's European operations are located) to get more details on the study.
The company has moved swiftly to downplay the scandal, noting its behaviour didn't contravene the data use policy that all 1.3 billion Facebook members worldwide agree to abide by, often without reading it.
Members of the scientific community have expressed concerns over the paper, raising issues such as whether participants in the study gave informed consent or were given the opportunity to opt out.
Proceedings of the National Academy of Sciences (PNAS), the journal that published the study, issued an expression of concern in a posting Thursday, saying "informed consent and allowing participants to opt out are best practices" in any research involving human subjects.
PNAS acknowledged that Facebook, as a private company, was under no obligation to follow those practices, and said the fact that Facebook didn't follow them "does not preclude [Cornell authors'] use of the data."