Facebook (NASDAQ:FB) says it does not know which users it experimented on, in order to keep the subjects of its emotion study anonymous. Reuters

Facebook Inc. (NASDAQ:FB) has been toying with the emotions of its users in the name of science. After it came to light over the weekend that the social network had manipulated the news feeds of nearly 700,000 users two years ago, one of the lead authors of the study apologized and said the “research benefits of the paper may not have justified all of this anxiety.”

The experiment involved 689,003 Facebook users, but can any of them find out whether their news feeds were tampered with? Nope. A Facebook spokesperson said the company could not re-identify the study's subjects even if it wanted to.

To keep the study anonymous, the researchers who conducted the experiment were not given identifying details such as user names, and because the data were aggregated, there is no way to connect any of the study's findings to a specific user's account.

The study was conducted in 2012 on English-speaking Facebook users, who were divided into three groups. For one week, one group had posts containing “positive” emotional words filtered out of its news feed, another had “negative” posts removed, and the third served as a control group.
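
Conceptually, the manipulation amounted to a word-list filter over each user's news feed. The sketch below is a hypothetical illustration, not Facebook's code: the word sets, function names, and all-or-nothing filtering are assumptions (the published study reportedly classified posts using the LIWC word-counting software and omitted matching posts probabilistically rather than uniformly).

```python
# Hypothetical sketch of emotion-based feed filtering; the word lists and
# logic are illustrative assumptions, not the study's implementation.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}  # assumed examples
NEGATIVE_WORDS = {"sad", "angry", "terrible", "awful"}    # assumed examples

def contains_emotion(post: str, word_set: set[str]) -> bool:
    """Return True if any word in the post appears in the emotion word list."""
    return any(word in word_set for word in post.lower().split())

def filter_feed(posts: list[str], suppress: set[str]) -> list[str]:
    """Drop posts containing words from the suppressed emotion category."""
    return [p for p in posts if not contains_emotion(p, suppress)]

feed = ["What a wonderful day!", "Feeling sad today.", "Lunch at noon."]
print(filter_feed(feed, suppress=POSITIVE_WORDS))
# ['Feeling sad today.', 'Lunch at noon.']
```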

The paper describing the experiment reported a small but statistically significant correlation between the emotions users saw in their news feeds and those in the posts they themselves wrote: users whose feeds were altered to show more negative posts were more likely to write negative posts of their own.
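
A tiny effect can still clear the bar for statistical significance when the sample is this large. The toy calculation below illustrates the point with invented numbers; it is not the study's data or its actual analysis, and the effect size shown is an assumption chosen for illustration.

```python
# Toy illustration: with ~150,000 users per condition, even a minuscule
# shift in average emotional word use yields a vanishingly small p-value.
# All figures here are invented; this is not the study's analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 150_000  # roughly the size of one experimental condition

# Hypothetical "% negative words per post" for control vs. treatment users;
# the 0.04-point shift is an assumed, illustrative effect size.
control = rng.normal(loc=1.75, scale=2.0, size=n)
treatment = rng.normal(loc=1.79, scale=2.0, size=n)

t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"mean difference: {treatment.mean() - control.mean():.3f}")
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")  # tiny effect, highly significant
```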

While the research may have been legal, many in the medical community question whether Facebook acted ethically by turning hundreds of thousands of its users into unwitting subjects. After all, how many people actually read the terms of service that accompany digital products?

Facebook said it could not answer the ethical questions raised by intentionally showing users a barrage of artificially slanted negative, or positive, news feed stories. More than one in four Americans has at least one mental illness, and Facebook officially bans users under the age of 13. Did the data scientists who ran the study include users with psychiatric disorders, or those under 18? Facebook declined to comment.

The world’s largest social network maintains that it conducted the experiment to improve the service in an open way, publishing the results two years later. While many companies research ways to improve their services, few then publish the findings in peer-reviewed science journals, where the general public can read them.

A Facebook employee said the study’s findings have not yet been incorporated into the company’s ever-changing news feed algorithm, though that may change in the future.