We’re sorry.
That was the response of Facebook’s Sheryl Sandberg, who for the first time publicly addressed the raging firestorm over the secret study conducted on nearly 700,000 unknowing users, whose news feeds were injected with negative content to see whether it made them sad.
At a meeting in India with potential advertisers, Sandberg, Facebook’s very public chief operating officer, was quoted in the Wall Street Journal as saying:
“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated. And for that communication we apologize. We never meant to upset you.”
Incredibly, the results of the study appeared in the Proceedings of the National Academy of Sciences (PNAS). Facebook did not tell the 700,000 hapless guinea pigs that they were in the study, which tabulated the emotional reactions of users who became sad and emotionally withdrawn when presented with negatively skewed posts in their news feeds.
Sandberg’s apology follows on the heels of a post by one of the study’s researchers, Adam D.I. Kramer, who published his missive on a Facebook blog on Monday. The research team said they wanted everybody using the social media site to enjoy the experience. Kramer’s post opened:
“The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone.”
Just as Sandberg was explaining herself, others in the techsphere were shredding the social-media experiment as highly unethical and more than questionable. Computer scientist Dr. Andy Carle, 31, an expert in human-computer interaction, blasted the report and went one step further: He believes Facebook has more such undisclosed studies in its pipeline.
Facebook did not get back to VentureBeat by press time.
At the very least, Carle said, those involved in the study should have been notified. He also pointed out that, in his opinion, Facebook didn’t itself know the names of those being targeted in what some have called a virtual dragnet. The study could have negative consequences, he said, for those who were left feeling bummed.
Carle pointed out that had the study been undertaken in Western Europe, it would likely have invited lawsuits at the very least, because privacy laws in some countries there are far more stringent than in the States.
“I have no doubt they will continue with these studies,” he said. “This worries me, the culture of user research at Facebook. At the very least they should have been notified. The question is, what else is Facebook doing now?
“There is a code of conduct in doing this kind of work. But Facebook wasn’t holding themselves to this kind of conduct.”