2014-06-30

Co.Exist

Your Emotions Might Have Been Messed With By Researchers On Facebook: Now They're Saying Sorry

"Our goal was never to upset anyone."

Last week, Facebook researchers published a study that now has Facebook users roiled. Over the course of one week in January 2012, the researchers tinkered with nearly 700,000 people's News Feeds to see whether the changes would lead to more positive or negative expressions of emotion.

The goal, the researchers write in the Proceedings of the National Academy of Sciences, was to see if the emotional content of Facebook posts could be contagious. But after a weekend of reports questioning the ethics of manipulating someone's online experience to elicit an emotional reaction (unbeknownst to users, critics argue), the study's lead researcher responded last night, via Facebook.

Adam Kramer writes:

Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

It's a funny sort of apology, in that it apologizes for the way that the research was described and not how it was conducted. But there's also one major difference between the Facebook study and other studies that have investigated the same "emotional contagion" phenomenon: the human guinea pig factor.

Unlike a "natural experiment" that observes what happens on Facebook and analyzes the already-available data, here researchers deliberately changed people's News Feeds and gauged the results. For a week, thousands of people experienced Feeds marked by less positive words, while thousands of others experienced Facebook with less negativity. As a result, the researchers wrote that people with less positivity in their Feeds began showing more negativity, and vice versa. (Kramer and others also performed an earlier study which didn't manipulate people's feeds but found that sad emotions traveled through people's networks like a virus.)

Meanwhile, the people exposed to fewer emotional posts got quieter.

As the Atlantic's Robinson Meyer notes, the experiment was legal, in that Facebook can do whatever it wants for research purposes under its users' terms of service agreements. And the study seems pretty consistent with what online marketers do all the time anyway, which is to try to manipulate people's decisions and habits based on what they see on their screens. The difference, though, is that we can generally recognize an advertisement when we see one, or even recognize when sophisticated algorithms show us tailored ones. Academics have different standards for human test subjects.

"What's troubling to me was the fact that the researchers argue that they had gotten informed consent through this experiment because of the data use agreement," says Kathryn Montgomery, a professor of communications at American University who specializes in privacy and technology issues. "I sincerely do not believe that people understand all of the manipulation that goes on with their information and their experiences routinely. That's why we have research protocols for using human subjects."

But outside of the academy's line of ethical questioning, Facebook has faced charges of misleading users about their privacy before. In 2012, Facebook and the Federal Trade Commission settled on a legally binding consent order that stipulated the company obtain consent before making changes to users' privacy settings. In response to the fresh outrage, Kramer noted in his post that Facebook would be incorporating the reaction to the PNAS paper into its internal review practices. But only time will tell if the study generates enough concern to get the company to change the rules.


[Image: Abstract via Shutterstock]
