Over the course of a week in 2012, Facebook secretly experimented on 690,000 users by manipulating their news feeds to highlight negative or positive posts. The company did so to see whether it made people more or less happy. Unsurprisingly, it discovered that the more negative posts people were exposed to, the more unhappy and negative their own posts became.
This means Facebook deliberately tinkered with the emotions of hundreds of thousands of its own users without their knowledge.
There was also apparently no effort to keep children out of the study.
Such manipulation is unethical and a clear warning to consumers about the frequently misunderstood relationship between social media companies and users.
In April, Facebook CEO Mark Zuckerberg told Wired magazine, “Our philosophy is that we care about people first.” Don’t believe it.
To companies that offer free services, you are not a customer; you are a commodity. This does not mean Facebook doesn't value you; it does, in the same way a dairy farmer values his cows. To extend the analogy, Facebook and its partners are milking you for your information and your attention. You are livestock.
Time and again, Facebook has changed its platform to reveal more information than many users are comfortable with, and time and again it has apologized and moved to mitigate whatever policy was causing the uproar. But this episode is the worst to date.
Privacy activist Lauren Weinstein tweeted: “I wonder if Facebook killed anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it’s possible.” She has a point; with hundreds of thousands of subjects, the sample certainly contained some very fragile people, none of whom consented to be emotionally toyed with.
Facebook tried to make it seem less creepy by pointing out that users give permission for this kind of thing when they sign up, but, according to Forbes, Facebook didn’t add “research” to its data-use policy until four months after the manipulation occurred.
Even if Facebook's policy had been updated before the study, service agreements are long and obscure to the point of absurdity. A software retailer famously inserted a clause into such an agreement on April Fools' Day a few years ago that read, "By placing an order … you agree to grant us a nontransferable option to claim, for now and for ever more, your immortal soul." Thousands of people agreed to hand over their souls before anyone noticed.
It has also been reported that the Cornell University Institutional Review Board granted ethical approval for the Facebook study. However, like the data-use policy update, that approval came after the fact: the board approved Cornell researchers' use of an existing data set, not the human manipulation required to produce the data in the first place.
Data collection may now be a permanent part of our lives, but users must be clearly told what they are signing up for, and we should all remain vigilant against more abuse of this kind.
— From the Orange County Register