Facebook’s infuriating mood manipulation study: What I’m afraid the company learned


By now you must know that Facebook made you sadder for science.

For a week in 2012, it altered the News Feeds of almost 700,000 users, so that some saw happier posts and some saw sadder posts.

It found that the happy group went on to post happier things and the sad group went on to post sadder things, and published its findings in a prestigious scientific journal with prestigious scientific co-authors.

But the fallout — fast, furious and far-reaching — has been much more interesting, proving once again that the company that pushes all our buttons — even as it gets us to push more of its own — still has the ability to surprise us.

“I guess Big Brother is done with just watching.”

That’s how one reader put it at the end of a great Atlantic piece on how this all went down.

Fuming about Facebook is a sport at this point. We've suspected so often that companies like it have violated our privacy, security, content ownership and even civil rights that it was starting to feel like we'd seen every foul play.

Nope.

That’s one bubble burst: Along with data mines, content generators, eyeballs, billboards and surveillance subjects, the companies that connect us on a massive scale can also treat us as lab rats.

Facebook didn’t get explicit consent from users whose feeds it manipulated; blanket language in the Terms of Service no one reads was good enough.

It didn’t tell those users they’d been part of a study; the news spread this weekend and no one knows who actually saw emotionally altered feeds.


Consent and disclosure are not mere courtesy. They’re how scientists ensure that experimental subjects are people first. And no one — not even the two academics who co-authored the study — stood up for them.

That pops the second bubble: When even the rigid ethics of science meets the sly legalese of a powerful corporation, the corporation wins.

A Cornell University press release demeaned all academia by offering excuses. The researchers were not involved in the collection of the data, it assured. Only in their study.

Facebook’s excuses were more expected. The Facebook co-author, Adam Kramer, pointed out in a public Facebook post that the news feed manipulation was small. The impact slight.

It so disrespects our intelligence to cast a problem of principle as a problem of degree.

Just treat us like people, dammit.

More than our pride is at stake. As tech author Jaron Lanier wrote in The New York Times:

It is unimaginable that a pharmaceutical firm would be allowed to randomly, secretly sneak an experimental drug, no matter how mild, into the drinks of hundreds of thousands of people, just to see what happens, without ever telling those people. Imagine a pharmaceutical researcher saying, “I was only looking at a narrow research question, so I don’t know if my drug harmed anyone, and I haven’t bothered to find out.”

Unfortunately, this seems to be an acceptable attitude when it comes to experimenting with people over social networks. It needs to change.

We’re not imagining things. Because of how businesses operate and don’t operate, because of regulations that exist and don’t exist, but most of all because of the huge imbalance of power that a goldmine of data has created between them, the companies, and us, the individuals, it’s not a matter of debate anymore.

These companies are getting away with too much.

I’ve written about tech in an interesting time. The longer I write, the more I believe that all our anxiety about the companies that both power and manipulate our digital universe can be summed up in one big problem:

They know too much. And we don’t know enough.

Thanks to our outrage, the Facebook study fail is well on its way to becoming a cautionary tale. Other companies have watched. Other companies will be careful.

But that’s the third bubble burst: Maybe all this has taught companies like Facebook to keep more of their findings to themselves.

Mónica Guzmán is a freelance journalist, speaker and award-winning digital life columnist for GeekWire. You can find her tweeting away @moniguzman, subscribe to her public Facebook posts at facebook.com/moniguzman or reach her via email. See this archive of her weekly GeekWire columns.

  • BigBrotherTinyBrain

    The best way to deal with this is for Facebook users to demand finding out if they were part of the negative group, and then sue the company for any of their misfortune. At this size there’s gotta be some Facebook users who had it really tough during the study period. Clearly tougher than the control group, so there’s causality provided by the Cornell scientists. Sue them, sue them hard.

  • tsupasat

    Online companies live and die by A/B testing. The companies that test and iterate most and best win. Facebook and others are constantly studying user behavior to get us to think and act in certain ways. We’re just upset because they published a study about it.

    • http://moniguzman.com Monica Guzman

      But when does A/B testing for design improvement cross a line? That’s the question this issue is raising. A/B has been an accepted, useful practice for a long time, and there’s no reason that should stop. New companies, in particular, benefit in a big way from understanding their audience this way. But what about when a company knows a ton about you? So much that its testing might impact your mood in ways that could conceivably cause harm? Do we still just say, no big deal, it’s just business? New circumstances and new powers are raising new questions.

      • tsupasat

        I agree Facebook’s test was odd and discomfiting. That said, it’s inevitable that an increasing number of companies will know more about us as tracking technologies improve (see this article: http://en.wikipedia.org/wiki/Device_fingerprint ), and that they will seek to make use of that data for competitive advantage. The only way to avoid it is to not do stuff online.

  • http://www.richardsprague.com sprague

    Monica, please read the actual study and tell us if you think Facebook really manipulated anybody. With a d=0.001 and a reliance on very shaky LIWC methods, it’s hard to take the results seriously. Maybe it’s the tech media, not Facebook, trying to manipulate us into thinking they’re more powerful than they really are. http://psychcentral.com/blog/archives/2014/06/23/emotional-contagion-on-facebook-more-like-bad-research-methods/