When you sign up for Facebook, you have to agree to a whole laundry list of fine-print terms and conditions (which almost nobody ever reads). One of the things you consent to is Facebook’s Data Use Policy, which gives Facebook the right to use your info for “…troubleshooting, data analysis, testing, research and service improvement.”
Well, it seems that Facebook has taken full advantage of the “research” portion of that agreement. A study published two weeks ago in the Proceedings of the National Academy of Sciences (PNAS) revealed that Facebook recently carried out an experiment that involved manipulating users’ emotions.
Basically, Facebook wanted to know if removing sad, angry or otherwise negative terms from a user’s News Feed would affect how happy or sad the statuses they posted were.
So they randomly selected 689,003 users and tweaked the computer algorithms that determine what pops up on your News Feed. Some of the users were fed primarily neutral to happy information and stories, while others were fed primarily neutral to sad or depressing information.
It probably comes as a surprise to nobody that the users who were fed more negative information tended to post more gloomy statuses.
Congratulations, Facebook, you proved something that 99% of 5th graders could probably have just told you.
But what about all of the users who Facebook intentionally made sad? Some serious questions have been raised about the ethics of the experiment.
Any experiment that receives federal funding has to abide by a code of rules known as the Common Rule for human subjects. The Common Rule’s definition of consent requires researchers to give test subjects “a description of any foreseeable risks or discomforts to the subject.”
Facebook clearly didn’t abide by that standard, but since the test wasn’t federally funded, the company is technically exempt. However, PNAS also has its own set of rules for publication. Unfortunately, it seems to have bent or broken a few of them to publish the Facebook study.
Most notably, PNAS’s guidelines for publishing require that a study abide by the principles of the Helsinki Declaration, which states that test subjects must be
“…adequately informed of the aims, methods, sources of funding, any possible conflicts of interest, institutional affiliations of the researcher, the anticipated benefits and potential risks of the study and the discomfort it may entail.”
Clearly, manipulating the emotions of nearly 700,000 oblivious users is a blatant violation of this principle. With most people getting the bulk of their news and information from Facebook, it’s pretty unsettling to find out that the company is conducting mass psychological testing on us.
Read the original story from Slate here.