Facebook’s controversial study that manipulated users’ newsfeeds was not
pre-approved by Cornell University’s ethics board, and Facebook
may not have had “implied” user permission to conduct the study as
researchers previously claimed.
In the study, researchers at Facebook tweaked what hundreds of
thousands of users saw in their news feeds, skewing content to be
more positive or negative than normal in an attempt to manipulate
their mood. Then they checked users’ status updates to see if the
content affected what they wrote. They found that, yes, Facebook
users’ moods are affected by what they see in their news feeds.
Users who saw more negative posts would write more negative things
on their own walls, and likewise for positive posts.
(For a refresher on the controversy, check out The
Washington Post’s story from Monday.)
Ethics board consulted after the fact
As reported by The Post and other news outlets, Princeton University
psychology professor Susan Fiske told
the Atlantic that an independent ethics committee,
Cornell University’s Institutional Review Board (IRB), had
approved use of Facebook’s “pre-existing data set” in the
experiment. Fiske edited the study, which was published in the
June 17 issue of Proceedings of the National Academy of Sciences.
A statement issued
Monday by Cornell University clarified the experiment was
conducted before the IRB was consulted. A Cornell professor,
Jeffrey Hancock, and doctoral student Jamie Guillory worked with
Facebook on the study, but the university made a point of
distancing itself from the research. Its statement said:
Professor Hancock and Dr. Guillory did not participate in data collection
and did not have access to user data. Their work was limited to
initial discussions, analyzing the research results and working
with colleagues from Facebook to prepare the peer-reviewed paper
“Experimental Evidence of Massive-Scale Emotional Contagion
through Social Networks,” published online June 2 in Proceedings
of the National Academy of Sciences-Social Science.
Because the research was conducted independently by Facebook and
Professor Hancock had access only to results – and not to any
data at any time – Cornell University’s Institutional Review
Board concluded that he was not directly engaged in human
research and that no review by the Cornell Human Research
Protection Program was required.
Informed consent called into question
The researchers claimed the fine print users agreed to when they signed
up was tantamount to “informed consent” to participate in the
study. Facebook’s current data use policy says user information
can be used for “internal operations” including “research.”
However, that’s not what it said in 2012 when the study was
conducted. According to Forbes:
In January 2012, the policy did not say anything about users
potentially being guinea pigs made to have a crappy day for
science, nor that ‘research’ is something that might happen on the site.
Four months after the study, in May 2012, Facebook made
changes to its data use policy, and that’s when it
introduced this line about how it might use your information:
‘For internal operations, including troubleshooting, data
analysis, testing, research and service improvement.’ Facebook
helpfully posted a ‘red-line’
version of the new policy, contrasting it with the prior
version from September 2011, which did
not mention anything about user information being used in research.
“When someone signs up for Facebook, we’ve always asked permission to
use their information to provide and enhance the services we
offer,” a Facebook spokesman told Forbes. “To suggest we conducted
any corporate research without permission is complete fiction.
Companies that want to improve their services use the information
their customers provide, whether their privacy policy uses
the word ‘research’ or not.”
The revelation will likely further rile critics already angered that
Facebook fell short of the standards
imposed by the government and professional associations for
informed consent in studies conducted on humans. Informed
consent involves disclosing information about the study before it
takes place and giving subjects a chance to opt out – and Facebook
did neither. Since Facebook is a private company, it isn’t held to
those standards, according
to legal experts interviewed by the International Business Times,
but that hasn’t stopped some from feeling violated and angry.
If international headlines are an accurate gauge of public opinion,
people worldwide are angry at Facebook. Here’s a sampling,
translated badly by Google Translate:
Manipulierte Newsfeeds: Facebook, das permanente Psycho-Experiment
Manipulated Newsfeeds: Facebook, the permanent Psycho-Experiment
Má nás Facebook za laboratorní krysy? Experiment vyvolal bouřlivé reakce
Facebook has for us lab rats? Experiment provoked strong reactions
O grande problema do Facebook? Cegueira ética
The big problem of Facebook? ethical blindness
Facebook svarer på massiv kritikk
Facebook responds to massive criticism
פייסבוק ביצעה ניסוי ברגשות של משתמשים, ועכשיו הם כועסים
Facebook has made trial users’ emotions, and now they’re angry
Laboratóriumi patkányokként kezeli a Facebook a felhasználóit? Háborognak az emberek
patkányokként manages the Facebook users? Indignation of the people
Facebook “ammette” di manipolare l’umore degli utenti
Facebook “admits” to manipulate the mood of the users
इमोशंस से खेल रहा था फेसबुक, न्यूजफीड से की छेड़छाड़!
Facebook was playing with your emotions, feed tampered with!
emotions are controlled face book yet?
Des utilisateurs de Facebook « manipulés » pour une expérience psychologique
Facebook users “manipulated” for a psychological experiment