Facebook toying with your feelings: what does the experiment mean for you?
Why the internet's furious about News Feed fiddling
Is this a one-off?
No. As the Wall Street Journal reports, Facebook's data scientists get up to all kinds of tomfoolery - including locking a whole bunch of people out of Facebook until they proved they were human. Facebook knew they were human; it just wanted to test its anti-fraud systems.
Is there a conspiracy theory?
Of course there is. Cornell University, which worked with Facebook on the study, originally said that the US Army's Army Research Office helped fund the experiment. That has since been corrected to say that the study "received no external funding", but the internet is awash with tales of military involvement.
That isn't as far-fetched as it sounds. The US created a "Cuban Twitter" to foment unrest in Cuba, and as Glenn Greenwald reports, security services are all over social media: "western governments are seeking to exploit the internet as a means to manipulate political activity and shape political discourse. Those programs, carried out in secrecy and with little accountability (it seems nobody in Congress knew of the 'Cuban Twitter' program in any detail) threaten the integrity of the internet itself."
Facebook is no stranger to manipulating public opinion. In 2010, it encouraged an estimated 340,000 additional people to get out and vote by subtly changing the banners on their feeds. As Laurie Penny writes in the New Statesman, that gives Facebook enormous power: "What if Facebook, for example, chose to subtly alter its voting message in swing states? What if the selected populations that didn't see a get-out-and-vote message just happened to be in, say, majority African-American neighbourhoods?"
Is it time to make a tinfoil hat?
Probably not. A few changes to news feeds are hardly the distillation of pure evil, and it's clear that the study is acting as a lightning rod for many people's loathing of Facebook. However, the controversy should be a reminder that Facebook is no mere carrier of your information: it actively intervenes in what you see, using algorithms to present you with whatever it thinks will encourage you to spend the most time using the service.
That's very different from rival services such as Twitter, which shows you everything and lets you decide what matters and what doesn't. The difference between your news feed in chronological view (if you can find it) and Top Stories view is dramatic.
Here's a conspiracy theory we can get behind: as with most free services on the internet, Facebook's users are its product and its customers are advertisers. Would Facebook deliberately manipulate the emotional content of your news feed to make you more receptive to advertisers' messages? And if it did, how would you know?
Writer, broadcaster, musician and kitchen gadget obsessive Carrie Marshall has been writing about tech since 1998, contributing sage advice and odd opinions to all kinds of magazines and websites as well as writing more than a dozen books. Her memoir, Carrie Kills A Man, is on sale now and her next book, about pop music, is out in 2025. She is the singer in Glaswegian rock band Unquiet Mind.