2014-07-28

Co.Exist

Why Did We Care About the Facebook Contagion Study? Or Did We Even Care At All?

General outrage erupted after Facebook was caught messing with people's News Feeds in order to get an emotional response. Or did it? A new study looks at how people really feel about being manipulated.

Earlier this month, academics, politicians, journalists, and thousands on social media expressed considerable outrage at the news of a Facebook study that toyed with the News Feeds of nearly 700,000 Facebook users. The experiment removed positive or negative posts, then examined whether those changes affected the users' own emotional expression on Facebook. In the weeks following the viral story, at least one American politician and a privacy group filed formal complaints with the Federal Trade Commission, and the U.K.'s data regulation body, the Information Commissioner's Office, launched an investigation into Facebook's practices.

It's a fascinating turn of events, especially considering that the first coverage of the study—from science publication New Scientist—didn't cause much of a hubbub. Instead, blogger Sophie Weiner lit the story's real fuse when she picked up on the idea that none of the test subjects—Facebook users—had given their informed consent for the experiment, aside from glossing over the Data Use Policy terms when they signed up for the service.

But if Weiner hadn’t found the consent angle, would anyone have been bothered? According to a recent Microsoft Research working paper that gauged reactions to the Facebook debacle, the majority of people who weren’t primed by media reports to be angry in the first place didn’t care much about Facebook’s actions at all.

In one survey asking respondents on Amazon's Mechanical Turk to weigh the ethics of the Facebook study against other university studies, only 23% of those who hadn't already been made aware of the news said Facebook shouldn't have been allowed to proceed. Among a separate group asked how they felt about certain conditions of the Facebook experiment, 24% said the experiment should be given the axe. A much larger percentage of respondents who had already heard of the study (47% and 46%, respectively) agreed that the experiment should have been shut down.

So, why the disconnect? Well, we know that the news colors opinions. People who had already heard of the study probably also heard that it was a controversy—and the presence of conflict likely shaped their views. But one also has to take into account possible selection bias: The respondents to Microsoft's survey were Amazon Mechanical Turk users, a tech-savvy crowd used to whipping through surveys at light speed. Perhaps the majority didn't express much outrage because they're already paid to take part in market research (a category the Facebook study probably would have fallen into) and are familiar with how it works.

But despite its limitations, the Microsoft study did yield some thought-provoking insights. When Microsoft Research asked survey respondents how they'd feel if certain conditions of the Facebook experiment were altered, more subjects wanted to shut down the research if a) it only removed positive posts from people's News Feeds, and b) Facebook vowed that the experiment had nothing to do with advertising. The first part makes sense—but the second is odd. Why would users react badly to a promise not to use the study to better target ads? Did that promise backfire?

The authors of the Microsoft Research paper declined an interview. But to Elizabeth Buchanan, director of the Center for Applied Ethics at the University of Wisconsin-Stout and a leading scholar on Internet research, the backfired promise effect could suggest something important about how we think our lives intertwine with Facebook's online environment.

"Facebook is a very natural part of our lives now, and we don't necessarily think of it as something distinct from our experiences," Buchanan says. "The ads are appearing to us, and we're not thinking of them as intentional in some way."

But mention the possibility that Facebook could use the study to better target ads, and the illusion of a safe social space is shattered. "If that's the case that we wouldn't want Facebook to better target ads, I think in a sense it violates the natural experience that we like to think we're having," Buchanan says. "We buy into the man behind the curtain."

Buchanan has a number of other criticisms of the Microsoft study—because of its design, she doesn't think it necessarily reflects the views of the public at large. But she does note that it's only now, once the initial furor has died down, that more nuance has been able to enter the conversation about Facebook's research. Last week, Nature published an op-ed defending Facebook's research on behalf of 27 other social scientists. Facebook changed its News Feed algorithms all the time anyway, they wrote, and the reaction to a study like this could have a chilling effect on future research.

Much of that future research will be dictated by new collaborations between industry and academia—like Facebook Research, or Microsoft Research, for that matter. And maybe, Buchanan suggests, academics and other researchers will have to find a better way of delineating new ethical standards moving forward. Older ideas might not apply.

But the outrage at the Facebook emotional contagion study spilled outside of the academy. And if the average Facebook user wasn't angry that traditional academic standards had been violated, what was everyone so angry about?

It could be lack of consent. Or the possibility that Facebook has some sort of ability to toy with our emotions. Or, it could be that Facebook users were just delivered this bucket of cold water to the face: Facebook is a company, not a natural environment.

"I think it's a moment of awareness," Buchanan says. "It's jarring for the average user [to see] how much of the Facebook experience is manipulated, and that it isn't true, authentic, sharing—real life. It's not real life."

[Image: Abstract via Shutterstock]
