Scholars Ask For Facebook's "Emotional Contagion" Study to Be Withdrawn

Thursday, July 17, 2014 - 01:03 PM

Last month, Facebook announced that it had conducted an experiment in which it purposely showed a group of users only negative posts from their friends' news feeds. The premise was to test what the academics behind the research call "emotional contagion," the notion that moods can spread across networks. Well, everyone was annoyed at being manipulated, and the lead researcher on the study has apologized. The Electronic Privacy Information Center has asked the FTC to investigate, saying Facebook was duplicitous, manipulative, and failed to inform users of the experiment. Now, Maryland law professor (and friend of TLDR) James Grimmelmann, along with colleague Leslie Meltzer Henry and the faculty of the Berman Institute of Bioethics at Johns Hopkins University, has asked the Proceedings of the National Academy of Sciences to retract the Facebook study.

From the letter:

The sticking point is that Facebook users were involuntarily enrolled in the Facebook Study. They were not notified of their participation (and have not been to this day); they were not given the opportunity to remove themselves from the experiment. You have written that the research behind the article “may have involved practices that were not fully consistent with the principles of obtaining informed consent.” This is a serious understatement. The Facebook Study violated broadly accepted norms of research ethics. Its publication violated PNAS’s stated editorial policies. Retraction is the only appropriate response.

...

Participants were not told (and have not been told) that they were part of a study: no one gave them a point of contact for questions or offered them the ability to opt out. No one obtained specific consent for the study, let alone signed forms. Most of all, it was reasonably foreseeable that the Facebook Study would cause discomfort to participants. The study was designed to demonstrate that “emotions expressed by friends, via online social networks, influence our own moods,” and the initial hypothesis was that participants in one of the treatment groups would “express increased negativity.”

This isn't the first time Facebook has run afoul of the FTC. In 2011, Facebook agreed to a settlement with the agency after it kept changing its privacy settings, allowing information users had explicitly set as private to be exposed to the world. That settlement bars Facebook from "making any further deceptive privacy claims," requires that the company get consumers' approval before it changes the way it shares their data, and requires that it obtain periodic assessments of its privacy practices by independent, third-party auditors for the next 20 years. It's unclear whether this experiment violates that 20-year consent decree.

Comments [1]

Derek

The inability to default News Feed to 'Most Recent' that right there is the real emotional button-pusher.

Jul. 23 2014 12:21 PM

TLDR is a short podcast and blog about the internet by PJ Vogt and Alex Goldman. You can subscribe to our podcast here. You can follow our blog here. We’re also on Twitter, and we play Team Fortress 2 more or less constantly, so find us there if you like to communicate via computer games from six years ago.
