Don't Let Facebook's Emotional Manipulation Study Make You So Mad

Monday, June 30, 2014 - 08:29 AM

Last week, Facebook announced it had conducted an experiment on some of its users without their knowledge or permission. 

Here’s how it worked. For a short amount of time, one group of users saw fewer positive posts from their friends in their news feed, while another group saw fewer negative posts. The idea was to test whether moods can be contagious across networks. The researchers believe that the answer is yes: if suddenly everyone on your Facebook seems to be melancholy, you’re likely to write a slightly sadder post as well.

News of the study spread over the weekend, and now people are very angry at Facebook, enough so that the lead researcher on the study has publicly apologized. But why?

The main thrust of the objection is that people say that they don’t like to be emotionally manipulated. The problem is, that’s just not true.

Here’s one example. We watch TV just so that professional liars can pretend to be in love, or sad, or scared, because we want to trick our hearts into feeling those things too. And even during that lie that we’re enjoying, there are breaks where advertisers tell us shorter lies, not because they want to entertain us but because they want to manipulate us into buying dumb stuff.

Of course, consent and transparency in these manipulations are important, and Facebook’s experiment contained neither. On the other hand, the manipulations we receive from TV, film, or even a decent web series are so much more effective than what Facebook can do. In his apology, the lead researcher admitted as much:  “…the actual impact on people in the experiment was the minimal amount to statistically detect it -- the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.” Contrast that with Friday Night Lights, which will make the most emotionally resilient person cry at least once per season. 

All that said, the more interesting problems with the Facebook study are about academic ethics. 

First, there’s the consent problem. You don’t do studies on humans who haven’t given informed consent. (Facebook’s Terms of Service say they might anonymously use your information for “research,” which is a far cry from telling people you might try to make them feel sad for science.)

And over at Forbes, Kashmir Hill noted that Facebook didn’t submit the study to a university ethics review board for approval — the company just internally decided the experiment was ethically OK. Bad oversight! 

Lastly, it’s not clear the study was even well designed. The Atlantic’s Robinson Meyer talked to an expert who explained that Facebook’s methodology for telling happy posts from sad ones is fairly crude:

Here are two sample tweets (or status updates) that are not uncommon:

“I am not happy.”

“I am not having a great day.”

An independent rater or judge would rate these two tweets as negative — they’re clearly expressing a negative emotion. That would be +2 on the negative scale, and 0 on the positive scale.

But the LIWC 2007 tool doesn’t see it that way. Instead, it would rate these two tweets as scoring +2 for positive (because of the words “great” and “happy”) and +2 for negative (because of the word “not” in both texts).
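To make the critique concrete, here is a minimal sketch of the kind of blind word-counting the quote describes. The word lists are hypothetical stand-ins, not the actual LIWC 2007 dictionaries, but the failure mode is the same: every lexicon hit is tallied with no regard for negation or context.

```python
import string

# Hypothetical toy lexicons; LIWC 2007 uses far larger dictionaries.
POSITIVE_WORDS = {"happy", "great", "good", "love"}
NEGATIVE_WORDS = {"not", "sad", "bad", "hate"}

def naive_scores(text):
    """Count positive and negative lexicon hits, ignoring negation and context."""
    words = [w.strip(string.punctuation) for w in text.lower().split()]
    positive = sum(w in POSITIVE_WORDS for w in words)
    negative = sum(w in NEGATIVE_WORDS for w in words)
    return positive, negative

for update in ["I am not happy.", "I am not having a great day."]:
    print(update, naive_scores(update))
# Each update scores one positive hit ("happy" / "great") and one negative
# hit ("not"), so the pair totals +2 positive and +2 negative, even though
# a human rater would call both of them plainly negative.
```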

It’s hard not to like this last idea, that Facebook may’ve angered the entire internet over a study that could fundamentally be bunk. 

Comments [10]

mt

There are so many good comments here that reflect my views. Your comparisons to television and TV advertising are completely bogus, as outlined above. To extend the rebuttal of your argument that because we like to be manipulated consciously it must be all right to manipulate us unconsciously: that idea was clearly and legislatively put to bed long ago when subliminal advertising was made illegal. That pretty clearly shows that any kind of subconscious manipulation of our psychology is criminally wrong. This all proves that FB needs to be put under the full scrutiny of the law. New media needs legislative control. Its practices need to be fully examined by an objective body, and a set of guidelines needs to be imposed so that users will not have their civil rights and personhood violated just because they trusted a company (Microsoft tells us is so trustworthy) and then clicked an accept button without reading the fine print. In the meantime I urge all users to abandon FB; it has shown itself time and again to have no concept of ethics, and only by doing this will you end the unbelievable level of contempt it shows toward its users, or, as we can now legitimately call you, their marks.

Jul. 01 2014 07:53 PM
James Bleck from NYC

Does it take considerable effort to be this obtuse? Or does it just come off handily for you, PJ Vogt? The comments above already cover most of the ground I was inclined to intone on. But there's also the question of gatekeeper power: if Facebook can successfully skew feeds to secure effective emotional responses -- all without a user's knowledge that such extensive information-skewing is going on -- what's to say they won't eventually use this to market products? Or run political ads? -- in each case skewing results overwhelmingly or exclusively in favor of a paying advertiser to the exclusion of competing products or candidates. (With the obvious toll to competitors: pay us or this could very well happen to the feeds of voters/consumers.) The tailoring involved and the lack of transparency in Facebook's algorithms make detection -- by users or competitors -- potentially impossible.

Seriously, shame on you, On the Media. Histrionics over potential corporate power in connection with Net Neutrality but (not even well-argued) apologism for Internet giants like Facebook.

Jun. 30 2014 06:54 PM
Walter from USA

The wry title you chose for this piece suggests we all might step back a bit and wonder if this report from FB is an experiment in itself. If we just consider the technological advances all around us that only a few understand, why shouldn't we suspect that the science of psychological manipulation has been advancing beneath our radar all along as well?

Jun. 30 2014 03:21 PM

I'm so disappointed in this article I signed up just to comment on it.

Most people have already covered what I wanted to say, but here's a link to the most comprehensive article I've read on this subject; it analyses many aspects of and opinions on the problem. Please PLEASE read this!

http://codingconduct.tumblr.com/post/90242838320/frame-clashes-or-why-the-facebook-emotion-experiment

They've disregarded due scientific process. As for the argument that 'TV is emotionally manipulative, so what?', you're completely missing the point. This is taken from the article:

'We are used to emotional appeals in “face to face communication” or “advertising” frames. We are used to (and willingly expose ourselves to) emotional effects in “fictional media consumption” frames. But that “online social networks” intentionally affect emotions through algorithms is new, unusual to this frame and therefore feels “manipulative”, “creepy” to some.'

Jun. 30 2014 02:37 PM

It pains me to see that On the Media got this one so very wrong. As any researcher who works with human subjects knows, and is trained once-annually to remember, the history of human research is a dark one. That's why we have strong protections for people who help us conduct research. Altering the moods of hundreds of thousands of people, to an extent that could not be foreseen prior to the study, carries potentially substantial risk to some individuals. At the very least, you ask permission first and inform them of that risk. We should not take this responsibility lightly, as this article seems to do.

Jun. 30 2014 01:19 PM
Frank from Chicago

I am absolutely baffled by the manipulation analogy and how it correlates to television viewing.

Yes, television programs – whether scripted or 'reality' – are heavily edited to set the tone of the genre (comedy, drama, fantasy, romance, etc.). However, the audience is aware that they're viewing a taped program and tune in for the entertainment value. They're cognizant that there are actors and actresses in that little box and understand the program will be over in thirty minutes or an hour: A starship explodes; contestant misses the winning question; zombies ate your child; cops arrest the bad guys; Lassie finds Timmy in yet another well. The viewer is informed.

What transpires inside that self-contained world of television has no direct effect on the audience's lives. The television is neutral until a viewer selects a program. Don't like the entertainment or genre? Change the channel or turn the television off. In the Facebook experiment, the participants did not have such an opportunity. Instead, the Facebook users were unknowingly force-fed specific emotions that apparently involved posts from other people they knew. There was no remote control.

These 'participants' were exposed to posts that would paint either a sugarbowl or poop-sack world. This is not entertainment, especially when it involves emotions that may have been placed at risk.

Conducting this research "For Science" cannot justify the lack of informed consent.

Jun. 30 2014 12:09 PM

"The main thrust of the objection is that people say that they don’t like to be emotionally manipulated. The problem is, that’s just not true."

Where have you seen this "main thrust?" I've seen people objecting to being manipulated *without their permission*. People expect TV, movies, books, and blog posts that can't make up their mind to try to manipulate them. People don't (well, didn't) expect Facebook to.

And there's that whole ethics issue too, but it's only worthy of being buried halfway through the story, so whatever.

Jun. 30 2014 10:43 AM
Jon Henry from South Korea

I feel like you are attacking a straw man here. The problem is not that people do not want to be emotionally manipulated. And even if that were the case, the fact that there are worse forms of media manipulation does not justify this one. Especially since those worse forms of manipulation are heavily criticized by the public and critics alike. That's why there are so many organizations working to minimize or eliminate advertising towards children. That should be why we even have a show like OTM. The problem people have with this is that it happened at all. And the problem critics should have with this is that people are stupidly putting their faith in FB when they have absolutely no reason to trust them and many, many warning signs that they should not trust them.

All media, but particularly public media, have a duty to be watchdogs against the government and against large corporations who aim to manipulate the public. You should have the same critical view towards FB that TV critics had back in the day. The fact that you do finally explain the problems with this is weakened by your overall thesis.

Jun. 30 2014 10:25 AM
Ethan Kent from Chicago, IL

Love the show, etc. But a typo: you say “form”; you mean “from.”

Jun. 30 2014 09:59 AM
Dan Mitchell

I've taken to hearing these "tl;dr" posts in the voice of the pimply-teenager character from "The Simpsons."

Jun. 30 2014 09:37 AM

