Facebook’s public relations department is apparently letting Marc Andreessen speak for the company this weekend.
Yesterday, news broke that Facebook had conducted a massive experiment: the social network’s leadership manipulated the news feeds of some users to see whether it affected their mood. The revelation didn’t sit well with many people, though others, like seasoned investor Andreessen, saw nothing wrong with the practice.
“Run a web site, measure anything, make any changes based on measurements? Congratulations, you’re running a psychology experiment!” Andreessen tweeted late Saturday.
Andreessen argues that websites and web publications test the emotional effects of content all the time. This, he believes, is how sites improve.
Many of Andreessen’s followers agree with him, too. “Agree. Can’t tell if people are more upset at the study itself or if they merely thought Facebook was more ethical,” says one.
“The townspeople with their pitchforks have no time for your A/B shenanigans,” says another, with tongue in cheek.
But the notion that Facebook’s mood experiment is equivalent to ordinary audience research is debatable. It’s one thing to stand back and objectively measure how an audience reacts to content. It’s quite another to put a finger on the scale, actively manipulating users’ emotional responses for its own sake.
Facebook has a responsibility to allow people to control the tone of their own news feeds, to let them find their own place on a scale between “the straight story” and “rose colored glasses.”
Like it or not, a social network is a repository for very personal and emotional content. That’s why the user data collected there is so valuable to Facebook’s advertiser partners. It’s the reason that Facebook is making billions in revenues.
It’s insulting that Facebook would mess with users’ feeds without telling them, but the deeper issue here is control versus manipulation.
Facebook has always sought ways to make the site a “positive” experience. That’s why there’s no “thumbs down” button. This recent news feed experiment raises the question of how far it’s willing to go to make the service more positive.
Another Twitter user responded to Andreessen’s tweet, saying: “My concern is what they saw if they could induce negative reactions. Why make people feel worse?”
There may be a very good reason. Darkening some users’ moods might be good business. A study published last year suggested that women may be more susceptible to certain kinds of advertising when they feel less self-confident and more vulnerable.
VentureBeat reached out yesterday to the three researchers who conducted Facebook’s study. Only one, UCSF postdoctoral fellow Jamie Guillory, responded — saying that the lead researcher on the study, Facebook’s Adam Kramer, would be handling all inquiries.
Facebook’s public relations team has yet to respond.