Facebook toying with your feelings: what does the experiment mean for you?
Why the internet's furious about News Feed fiddling
Facebook has hit the headlines for all the wrong reasons: last week it emerged that nearly 700,000 users' news feeds were deliberately manipulated to see whether doing so could change their moods. Critics say that's unethical at best and downright evil at worst, and the UK Information Commissioner's Office has announced that it will investigate whether Facebook has breached data protection legislation.
Facebook in privacy shocker. Hold the front page!
This is a bit bigger than the usual "let's move all the privacy settings and make your pics public again" changes Facebook likes to make.
Is it? What actually happened?
For a week in 2012, Facebook data scientists meddled with the news feeds of over 689,000 users as part of a study with Cornell University and the University of California, San Francisco. Some users were shown more negative content; others, more positive. Facebook then analysed those users' own posts to see if the content they were shown had made them more positive or more negative.
Surely sites and social networks analyse user data all the time?
They do, and it's called A/B testing: you give two groups of users different versions of your content and see which is more successful. This goes beyond that, though: Facebook wasn't just observing users, but actively trying to change their emotions.
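For the curious, here's roughly what ordinary A/B bucketing looks like under the hood. This is a minimal Python sketch, not Facebook's actual code: the function name ab_bucket and the hashing scheme are illustrative assumptions. Each user is hashed deterministically into group A or B, so the same person always sees the same variant, and the site then compares how the two groups behave.

```python
import hashlib

def ab_bucket(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically assign a user to bucket 'A' or 'B' by hashing
    their ID together with the experiment name, so the same user always
    lands in the same bucket without storing any state."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Map the first 8 hex digits of the hash to a number in [0, 1)
    fraction = int(digest[:8], 16) / 0x100000000
    return "A" if fraction < split else "B"

# Roughly half of users see variant A, the other half variant B
print(ab_bucket("user_12345", "feed_ranking_test"))  # prints 'A' or 'B'
```

Hashing the user ID rather than rolling the dice on every visit is what keeps a given user's experience consistent across sessions, which is what makes the groups comparable.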
And that's bad because...?
It's bad because nobody was asked whether they wanted to participate in what is effectively a psychological study. Facebook does mention that it'll use your data for research in its terms and conditions, but that bit of the T&Cs wasn't added until after this experiment had already taken place.
It's arguably irresponsible too: how many of the people whose news feeds were made more negative were people with vulnerable emotional states or mental illnesses such as depression?
What did the study find?
The cheerier your feed, the cheerier your posts are likely to be, and vice versa. The more emotional the language in your feed, the more likely you are to post at all; if your feed is full of fairly flat, unexciting language, you'll be less inclined to join in.
How have people reacted to the news of the study?
The reaction of Erin Kissane (Director of Content at OpenNews) on Twitter was typical: "Get off Facebook. Get your family off Facebook. If you work there, quit. They're f---ing awful."
What does Facebook say about it?
"Mumble mumble mumble. Look! A duck!"
Really?
No. In a statement Facebook said: "This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible."
In a public Facebook post, study co-author Adam Kramer wrote: "Our goal was never to upset anyone... in hindsight, the research benefits of the paper may not have justified all of this anxiety."
So they've apologised?
Kinda. Sorta. Not really. Chief operating officer Sheryl Sandberg made one of those non-apology apologies so beloved of celebrities and politicians: the study was "poorly communicated, and for that communication we apologise. We never meant to upset you." Translation: we're not sorry we did it, but we're sorry that you're annoyed about it.