Last July, there was a media firestorm when Facebook was caught manipulating user emotions with news feed experiments. The data science team wanted to understand how positive and negative stories impacted the way users interacted with content and Facebook itself. Since the uproar, a Facebook spokesman confirmed to VentureBeat that it has, in fact, chosen to stop some experiments, including a massive study to see whether the social network can encourage voting among its young audience.
In 2010, one of the nation’s leading political scientists, Professor James Fowler of the University of California, San Diego, conducted a massive experiment to see if placing an “I voted” counter at the top of the newsfeed had a measurable impact on election turnout. Since the “I voted” counter was randomized and the team had access to actual voter records, Fowler’s team could reliably see how exposure to certain messages influenced voting.
As a former political scientist, I would argue that this is the most important academic study to date on the value of social media in democracy. Facebook managed to boost turnout by 2.2 percent, which is enormous by traditional election standards; previous research found that is nearly the entire turnout effect of the 2008 Obama campaign. If that still seems small, it's because most elections are won at the margins: campaigns spend millions just trying to get a few thousand more voters to the polls.
Facebook conducted more experiments during the 2012 election as well. But for the upcoming 2016 election, it has stopped. The service will show everyone the exact same button (or no button at all). This means it won't know how much it encouraged people to vote, or whether some messages are more effective than others. In other words, Facebook can no longer innovate.
Facebook’s answer is pretty non-responsive. In a statement, Facebook wrote:
“Voting is a core value of democracy and we believe that encouraging civic participation is an important contribution we can make to the community. We have learned over the past few years that people are more likely to vote when they are reminded on Facebook and they see that their friends have voted. We’re proud of this.”
The company continued (shortened for brevity):
“We first offered an ‘I’m a Voter’ button in 2008 to increase awareness about elections in the U.S. We did similar work in 2010 and 2012 and publicly released a paper in the journal Nature about the 2010 results. We plan on publishing an additional paper on the 2012 results. In these elections we have worked to improve people’s experience like we do with every Facebook product.”
The only real explanation I can imagine is that Facebook is scared of the press. Randomly assigning users different messages isn't technically difficult; the company could replicate its previous work with little effort.
Facebook was front-page news for a week when its more controversial news feed experiment came to light last July. Since then, it evidently doesn't want to anger critics any further, even with experiments that would probably be quite popular (the 2010 election experiment received moderate fanfare and little criticism).
Facebook’s decision is disappointing. A mission-driven company like Facebook shouldn’t let the loudest critics control its decisions.
You can read the original report on Facebook's decision at Mother Jones.