Facebook's team of mad data scientists caught flak this summer for experimenting on their users' emotions. But the company has a proud history of turning people into unwitting research subjects. For at least six years, the social network has been mining the political preferences of its users without their consent. The company will pull personal political data for the 2016 election as well, but this time Facebook will share its findings.
According to Mother Jones, Facebook began secretly subjecting its users to various political experiments before the 2008 election. The primary test seems benevolent enough: a prominent "I Voted" button to encourage people's friends to vote in elections. But the project, known as "voter megaphone," is drawing concern that the 1.3-billion-member network could influence elections.
Specifically, Facebook has shown that it is willing to manipulate what users see in their news feeds before elections:
In particular, Facebook has studied how changes in the news feed seen by its users—the constant drip-drip-drip of information shared by friends that is the heart of their Facebook experience—can affect their level of interest in politics and their likelihood of voting. For one such experiment, conducted in the three months prior to Election Day in 2012, Facebook increased the amount of hard news stories at the top of the feeds of 1.9 million users. According to one Facebook data scientist, that change—which users were not alerted to—measurably increased civic engagement and voter turnout.
Facebook officials insist there's nothing untoward going on. But for several years, the company has been reluctant to answer questions about its voter promotion efforts and these research experiments.
Emphasis added. Facebook's research into voter turnout mechanisms has led the company to conclude that it can get more people to the polls. One Facebook-conducted study suggested that the voter megaphone project increased turnout by 340,000 voters between 2006 and 2010.
Higher turnout pushed by Facebook is bad news for the Republican Party:
There may be another reason for Facebook's lack of transparency regarding its voting promotion experiments: politics. Facebook officials likely do not want Republicans on Capitol Hill to realize that their voter megaphone isn't a neutral get-out-the-vote mechanism. It's not that Facebook uses this tool to remind only users who identify themselves as Democrats to vote—though the company certainly has the technical means to do so. But the Facebook user base tilts Democratic. According to the Pew Internet & American Life Project, women are 10 points more likely to use this social network than men; young people are almost twice as likely to be on Facebook than those older than 65; and urbanites are slightly more likely to turn to Facebook than folks in rural areas. If the voter megaphone was applied even-handedly across Facebook's adult American user population in 2012, it probably pushed more Obama supporters than Romney backers toward the voting booth.
What could concern politicians even more? Facebook told Politico it can now determine a person's "sentiments" regarding politicians and issues—raising the possibility that the company could run issue-based experiments. And Politico reports that ABC News and BuzzFeed will begin receiving this detailed user data to enhance their election coverage.
The data will be gathered from the posts of Facebook users in the United States 18 and older, classifying sentiments about a politician or issue as positive, negative or neutral. The data can also be broken down into sentiments by gender and location, making it possible to see how Facebook users in the key primary states of Iowa or New Hampshire feel about certain presidential candidates, or how women in Florida feel about same-sex marriage.
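Politico's description amounts to a simple aggregation pipeline: label each post's sentiment, then tally the labels by demographic slice (state, gender). A minimal sketch of that breakdown is below; the posts and sentiment labels are invented for illustration, since Facebook's actual classifier and data are not public:

```python
from collections import Counter

# Hypothetical labeled posts as (state, gender, sentiment) tuples.
# In a real pipeline the sentiment label would come from a text
# classifier; here the labels are hand-assigned for illustration.
posts = [
    ("IA", "F", "positive"),
    ("IA", "M", "negative"),
    ("NH", "F", "neutral"),
    ("NH", "F", "positive"),
    ("IA", "F", "positive"),
]

def sentiment_breakdown(posts, state=None, gender=None):
    """Tally sentiment counts, optionally filtered by state and/or gender."""
    return Counter(
        sentiment
        for st, g, sentiment in posts
        if (state is None or st == state) and (gender is None or g == gender)
    )

# How do users in the key primary state of Iowa feel?
print(sentiment_breakdown(posts, state="IA"))
```

The same filter-then-count pattern supports any of the slices Politico mentions, such as how women feel about a given issue (`sentiment_breakdown(posts, gender="F")`).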
When pressed for a statement, a Facebook spokesperson assured Mother Jones that the company would not be running turnout trials with the voter megaphone in 2014. He also stated that past experiments were distributed in a way that was "entirely random," so no one party or candidate benefited unfairly.
But MoJo is suspicious of Facebook's intentions given its lack of transparency: "Facebook wants its users to vote, and the social-networking firm will not be manipulating its voter promotion effort for research purposes. How do we know this? Only because Facebook says so."