
Facebook Inc (NASDAQ:FB) is facing severe backlash for a 2012 experiment conducted on nearly 700,000 unwitting users, while the researcher who designed the study has responded to the criticism by attempting to explain its goal.

In a public Facebook post, Adam Kramer, one of the three researchers who authored the controversial study, defended the experiment's motive, saying that the social network cares about its users’ emotions. However, Kramer also admitted that the benefits of the research may not have been enough to justify the anxiety it created among users.

“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone,” Kramer wrote in the post. “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.”

In an attempt to determine whether users’ emotional states could be altered by exposure to specific content, Facebook tweaked the News Feed algorithm for 689,003 accounts, without the users' consent, so that they would see significantly fewer either positive or negative posts.

“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions,” the researchers wrote in the study, published in the Proceedings of the National Academy of Sciences.

However, many people have criticized the study, saying that Facebook played with its users’ emotions.

“Apparently what many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we'll respond to but actually change our emotions,” a post at Animalnewyork.com, the blog that drew attention to the study on Friday, said.

Kramer said that Facebook is now working on improving its internal review practices. According to him, posts were only hidden during the study, not permanently deleted: while a filtered post did not appear in a user's News Feed, it remained visible on the friend's own timeline.

“And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it -- the result was that people produced an average of one fewer emotional word, per thousand words, over the following week,” Kramer wrote.

Meanwhile, users had mixed reactions to Kramer’s explanation on Sunday.

“I love the fact that people want to slam Facebook over such a tiny thing but they don't speak up about the blatant use of far worse tactics to drive sales or manipulate the customer experience. People who fear these things need to learn that all of their actions are optional,” one user wrote.