By now you must know that Facebook made you sadder for science.
For a week in 2012, it altered the News Feeds of almost 700,000 users, so that some saw happier posts and some saw sadder posts.
It found that the happy group went on to post happier things and the sad group went on to post sadder things, and published its findings in a prestigious scientific journal with prestigious scientific co-authors.
But the fallout — fast, furious and far-reaching — has been much more interesting, proving once again that the company that pushes all our buttons — even as it gets us to push more of its own — still has the ability to surprise us.
“I guess Big Brother is done with just watching.”
That’s how one reader put it at the end of a great Atlantic piece on how this all went down.
Fuming about Facebook is a sport at this point. We suspect so often that companies like it have violated our privacy, security, content ownership and even civil rights that it was starting to feel like we'd seen every foul play.
That’s one bubble burst: Along with data mines, content generators, eyeballs, billboards and surveillance subjects, the companies that connect us on a massive scale can also treat us as lab rats.
Facebook didn’t get explicit consent from users whose feeds it manipulated; blanket language in the Terms of Service no one reads was good enough.
It didn’t tell those users they’d been part of a study; the news spread this weekend and no one knows who actually saw emotionally altered feeds.
Consent and disclosure are not mere courtesy. They’re how scientists ensure that experimental subjects are people first. And no one — not even the two academics who co-authored the study — stood up for them.
That pops the second bubble: When even the rigid ethics of science meets the sly legalese of a powerful corporation, the corporation wins.
A Cornell University press release demeaned all academia by offering excuses. The researchers were not involved in the collection of the data, it assured. Only in their study.
Facebook’s excuses were more expected. The Facebook co-author, Adam Kramer, pointed out in a public Facebook post that the news feed manipulation was small. The impact slight.
It so disrespects our intelligence to cast a problem of principle as a problem of degree.
Just treat us like people, dammit.
More than our pride is at stake. As tech author Jaron Lanier wrote in The New York Times:
It is unimaginable that a pharmaceutical firm would be allowed to randomly, secretly sneak an experimental drug, no matter how mild, into the drinks of hundreds of thousands of people, just to see what happens, without ever telling those people. Imagine a pharmaceutical researcher saying, “I was only looking at a narrow research question, so I don’t know if my drug harmed anyone, and I haven’t bothered to find out.”
Unfortunately, this seems to be an acceptable attitude when it comes to experimenting with people over social networks. It needs to change.
We’re not imagining things. Because of how businesses operate and don’t operate, because of regulations that exist and don’t exist, but most of all because of the huge imbalance of power that a goldmine of data has created between them, the companies, and us, the individuals, it’s not a matter of debate anymore.
These companies are getting away with too much.
I’ve written about tech in an interesting time. The longer I write, the more I believe that all our anxiety about the companies that both power and manipulate our digital universe can be summed up in one big problem:
They know too much. And we don’t know enough.
Thanks to our outrage, the Facebook study fail is well on its way to becoming a cautionary tale. Other companies have watched. Other companies will be careful.
But that’s the third bubble burst: Maybe all this has taught companies like Facebook to keep more of their findings to themselves.