Facebook Scientist Who Secretly Manipulated People's Emotions: My Bad

The Internet erupted in anger this weekend after discovering that Facebook attempted to manipulate the emotions of hundreds of thousands of people in 2012 as part of an experimental look into the power of its News Feed.

Coincidentally, the lead researcher of the controversial Facebook project is suddenly having some second thoughts about the whole thing.

Adam Kramer, the data scientist at Facebook in charge of the study, wrote a mea culpa of sorts on Sunday afternoon. In it, he said he never intended to hurt any of the 700,000 people affected by the study, which looked at how one's emotional state was altered by changes in News Feeds -- the main stream of status updates and photos seen when Facebook is first opened.

"I can tell you that our goal was never to upset anyone," Kramer wrote on Facebook. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

Kramer's major regret is that he and his colleagues didn't adequately explain why they conducted the study in the first place. The reason, he said, was that they were trying to make Facebook into a better product by better understanding the people who use it.

You can read his entire note here (or at the bottom of this post):


For a week in January 2012, Facebook tweaked the frequency of posts deemed "positive" or "negative" in test subjects' News Feeds. The researchers found that the group who saw more positive posts wrote more effusively on Facebook a few days later, which seemed to confirm the hypothesis that emotions are "contagious" on social networks.

Facebook published its findings in the Proceedings of the National Academy of Sciences two weeks ago, and the reaction was swift once the study was picked up by the media. Critics jumped on the company for not informing Facebook users they were part of an experiment -- especially since negative emotions were shown to be contagious, too. Others shrugged off the study, saying it was simply par for the course for Facebook.

But the further people dug, the more problems they found with the way Facebook performed its research. The complaints fell into three main buckets:

1. IT WAS BAD SCIENCE: Supposedly, Facebook did a bad job of measuring the very basic thing it was trying to measure: mood. Writing at Psych Central, psychologist John Grohol noted that the study infers someone's emotional state by counting the number of predetermined "positive" and "negative" words. So an obviously depressive snippet of text like "I'm not having a great day" will actually be read as positive because it contains "great day." (A rough sketch of this kind of word-counting approach appears after this list.)

2. OTHER SCIENTISTS CAN'T COPY IT: Other scientists can't replicate the research -- a must in science, where things need to be tested again and again. Why? Because Facebook keeps its (i.e., your) data under lock and key. Remember, Facebook isn't a college; it's a business.

3. IT WAS A LITTLE BIT UNETHICAL: Facebook failed to meet a basic ethical standard used at universities everywhere. While everyone on Facebook consented to having their data used in research when they signed up, Facebook didn't get what's called "informed consent" from anyone. James Grimmelmann, a law professor at the University of Maryland, wrote that this would have required Facebook to tell participants the duration, purpose, and risks of the study before it started -- and that they could have backed out of the study at any time.
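To make Grohol's first complaint concrete, here is a minimal sketch of the kind of naive word-counting scorer he describes. This is purely illustrative -- the word lists and function name are invented for this example and are not Facebook's code or the actual LIWC lexicon, which is far larger.

```python
# Illustrative sketch (not Facebook's actual method): score mood by
# counting "positive" and "negative" words, ignoring context entirely.
POSITIVE_WORDS = {"great", "happy", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def naive_mood_score(text: str) -> int:
    """Return (# positive words) - (# negative words) in the text."""
    words = [w.strip(".,!?") for w in text.lower().replace("'", "").split()]
    positives = sum(w in POSITIVE_WORDS for w in words)
    negatives = sum(w in NEGATIVE_WORDS for w in words)
    return positives - negatives

# "I'm not having a great day" scores +1 (read as positive) because the
# counter sees "great" but has no way to notice that "not" negates it.
print(naive_mood_score("I'm not having a great day"))  # -> 1
```

The point of the critique: a scorer like this cannot tell sarcasm or negation from genuine sentiment, which is why Grohol argued the study measured word choice rather than mood.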

In his note, Kramer wrote that he and his team have "come a long way" since 2012, and that they will take the negative reactions to heart.

"While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices," he wrote.

Here's Kramer's note if the embedded Facebook post above doesn't work for you:

OK so. A lot of people have asked me about my and Jamie and Jeff's recent study published in PNAS, and I wanted to give a brief public explanation. The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.

Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody's posts were "hidden," they just didn't show up on some loads of Feed. Those posts were always visible on friends' timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.

And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it -- the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.

Source: Huffington Post
