Over the weekend, reports surfaced that Facebook has been manipulating its 'Trending News' section to make certain stories appear more or less prominently, depending on the preferences of Facebook's editorial team.
The claims stem from a Gizmodo report based on interviews with former contractors who worked on Facebook's 'trending news' team - a group of around a dozen young journalists who select which news stories get featured in Facebook's prominent 'Trending' news section.
In the report, the former staffers described their role, saying that their duties consisted of reading through a list of trending mentions each day, as highlighted by Facebook's algorithm. The team would then connect those mentions to relevant, current news stories and work out which were the biggest issues of discussion across the platform, then write headlines for each of the topics, along with a three-sentence summary.
"The news curator also chooses the "most substantive post" to summarize the topic, usually from a news website. The former contractors Gizmodo interviewed said they were asked to write neutral headlines, and encouraged to promote a video only if it had been uploaded to Facebook. They were also told to select articles from a list of preferred media outlets that included sites like the New York Times, Time, Variety, and other traditional outlets."
All of that makes sense, at least to some degree, but the more troubling claims stem from the suggestion that Facebook's editors would regularly tamper with the news headlines, inserting stories they felt "should be" trending and blacklisting issues they didn't want on the network.
Those claims were further reinforced by additional former team members who came forward with similar details after the initial reports surfaced - one even shared a list of the stories that Facebook's editorial team blacklisted over time:
"Among the deep-sixed or suppressed topics on the list: former IRS official Lois Lerner, who was accused by Republicans of inappropriately scrutinizing conservative groups; Wisconsin Gov. Scott Walker; popular conservative news aggregator the Drudge Report; Chris Kyle, the former Navy SEAL who was murdered in 2013; and former Fox News contributor Steven Crowder."
Reactions to the report have been mixed - and it's worth noting that Facebook has strongly denied any claims of editorial tampering, saying that they have "rigorous guidelines in place for the review team to ensure consistency and neutrality".
But whether that's true or not, the discussion is worth having - Facebook plays a hugely influential role in how we consume media and stay informed about the world. At the same time, Facebook is a private company; it can publish whatever it wants. Given this, we need to look at how the platform contributes to news circulation - and how it could, theoretically or otherwise, use that position to its advantage.
Changing the Vote
In December last year, Facebook CEO Mark Zuckerberg published a post on his Facebook Page in response to US Presidential Candidate Donald Trump's controversial call to ban Muslims from entering the United States.
At the time, BuzzFeed journalist Alex Kantrowitz raised a question about Zuckerberg's opposition to Trump, noting that Facebook could, if it wanted to, make it much harder for Trump to actually take office:
"The company could remove Trump, or his posts, from the platform, and effectively become a censor of political speech. The company's statement, which said it's looking at this content on a case by case basis, already implies that this is an option."
As with its response to these latest claims of editorial tampering, Facebook has said that it would never suppress Trump's voice on the platform or seek to interfere with the political process.
But the fact remains that it could.
Some of you may be reading this and thinking "yeah, but it's just Facebook, it's not the only source of news". And that's true, but Facebook's proven in the past that it is able to sway the political process significantly, even through the simple act of boosting a message.
Back in 2010, around 340,000 extra voters turned out for the US Congressional elections because of a single election-day Facebook message. That's not speculation; it's based on published research.
The experimental design was relatively simple:
"About 611,000 users (1%) received an 'informational message' at the top of their news feeds, which encouraged them to vote, provided a link to information on local polling places and included a clickable 'I voted' button and a counter of Facebook users who had clicked it. About 60 million users (98%) received a 'social message', which included the same elements but also showed the profile pictures of up to six randomly selected Facebook friends who had clicked the 'I voted' button. The remaining 1% of users were assigned to a control group that received no message."
The results showed that users who received the informational message voted at the same rate as those who saw no message at all, while those who saw the social message - with images of their friends included - were 2% more likely to click the 'I voted' button, and 0.4% more likely to head to the polls than either of the other groups. Researchers estimated that the social message directly increased voter turnout by 60,000 votes, while a further 280,000 people were "indirectly nudged to the polls" by seeing notifications in their News Feeds that their friends had voted.
When looking at those numbers, the percentage results seem minor - a 0.4% lift in turnout doesn't sound like a meaningful proportion - but framed against Facebook's scale, 0.4% of the network's more than 1.65 billion monthly active users could end up changing an outcome. In the final results of the 2010 US Congressional election, upon which this research was based, the Republican Party regained control of the chamber, winning the popular vote by a margin of more than 5.8 million votes - so in context, the addition of 340,000 extra voters may not appear significant. But this wasn't a wide-scale effort to influence the actual result; it was a test of whether Facebook could do so if it wanted to. And the data shows it's entirely possible.
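To make the scale argument concrete, here's a quick back-of-the-envelope calculation using only the figures above - note the extrapolation of the 0.4% lift to Facebook's current user count is purely illustrative, not a finding of the study:

```python
# Figures reported from the 2010 voter-turnout experiment.
direct_votes = 60_000      # turnout directly attributed to the social message
indirect_votes = 280_000   # friends "indirectly nudged to the polls"

total_extra_votes = direct_votes + indirect_votes
print(total_extra_votes)   # prints 340000 - the study's total estimate

# Illustrative only: the same 0.4% lift applied to Facebook's current
# base of 1.65 billion monthly active users.
monthly_active_users = 1_650_000_000
potential_lift = int(monthly_active_users * 0.004)
print(potential_lift)      # prints 6600000 - larger than the 5.8M popular-vote margin
```

The point isn't that Facebook would move 6.6 million votes in practice - turnout effects don't scale that cleanly - but that even fraction-of-a-percent nudges, multiplied across a billion-plus users, land in the same order of magnitude as real electoral margins.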
Messing with the Trending section may not seem like a major concern - it's something you don't really pay attention to anyway, and no one relies solely on Facebook for news content, right? But the content that gets shown on Facebook, and how it gets shown, can have a significant impact.
And you likely wouldn't even realize it.
And of course, that's not the only example of emotional manipulation Facebook's been accused of - in 2014, Facebook came under fire for manipulating the News Feeds of nearly 700,000 users, showing them an abnormally low number of either positive or negative posts in order to determine whether the content they were served on the platform could alter their emotional state.
"...for people who had positive content reduced in their News Feed, a larger percentage of words in people's status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred."
As per Facebook's research, this was the "first experimental evidence for massive-scale emotional contagion via social networks".
In other words, the first definitive evidence that what you see on social media has a direct impact on your emotional state.
Facebook was strongly criticized for this research, with opponents saying it had essentially used people as lab rats, subjecting some to a mild form of mental torture purely on its own whim. The research set a bad precedent and shone a light on Facebook's capacity to manipulate emotional states - and if Facebook's research team can change how you feel, it stands to reason it could also change how you think, particularly through the curation and selection of which news content to show you.
Facebook's now serving more than a billion people per day, with users spending more than 50 minutes per day, on average, across Facebook, Instagram and Messenger. That's a lot of time to show them different messages - that's actually more time than people spend on any other leisure activity, outside of watching TV. More time than people spend reading. More time than people spend attending social events.
If you don't think Facebook has an influence over what you think - either directly or indirectly - you're wrong. And as such, it's important to consider the consequences of any type of manipulation or re-classification of the information presented to users on the platform.
The Future of News?
These latest claims become even more troubling when you consider Facebook's efforts to become a more prominent player in the news and media landscape. Earlier this year, Facebook announced the expansion of their 'Instant Articles' program - Instant Articles are posts published directly to Facebook, as opposed to links that send users off to an external site. This delivers a better user experience, with Instant content loading up to 10x faster than regular mobile sites and offering additional, mobile-friendly presentation features. But it also means publishers need to sacrifice their own traffic and trust that Facebook will provide adequate revenue share to replace it. Thus far, publishers seem happy with the arrangement, but there's a great deal of trepidation - most are still very wary of building a reliance on Facebook traffic, having seen what happened when Facebook's News Feed algorithm made it harder for Pages to generate organic reach.
If Facebook is to achieve this, and become a bigger part of the news and information landscape, it needs to be seen as impartial and neutral - as having no part in pushing its own agenda or controlling the news message. There's a direct parallel here with Facebook's efforts to expand internet access to poorer regions of India - while on one hand Facebook would be providing internet access to millions of people who don't currently have it, it would also be controlling that access, which means it could effectively cut out other players who offer services that compete with its own. This is the main reason India stopped Facebook's Free Basics program from going ahead.
But control of the media, and of the news content we consume, could arguably be more significant than restricting internet access. By manipulating the message, Facebook has the power to change sentiment, to make movements or destroy them. There's already speculation, on the back of these reports, that Facebook may have played an influential part in fueling the 'Black Lives Matter' movement, making it a bigger political issue by inserting the topic into the Trending section, even when it wasn't actually trending, based on on-platform discussion.
Even if the cause is, in itself, justifiably worthy of more attention, the question remains as to whether Facebook should be in a position to make that call. No doubt most people believe the Black Lives Matter movement raises important issues. But what if Facebook had ignored it - would it still have become as significant a political movement? And if so, would that have been more representative of the actual public interest?
What if there were another, more divisive issue rising and Facebook chose to promote it in the Trending section, leading to confrontations or riots - actions that might not have happened if Facebook hadn't pushed it? What if there were widespread criticism of Facebook and none of it ever appeared in the Trending section, tempering public sentiment?
This becomes even more relevant when you consider that, according to Pew Research, more Americans than ever are getting their news via Facebook.
Oh, and also, Mark Zuckerberg just recently built a contingency plan into his Facebook holdings that would enable him to retain control of the company if:
"...it were in connection with his serving in a government position or office."
Zuckerberg for President? Given the suggestions of the above noted research, and the growing influence of Facebook, maybe he'd be the most popular President in history. Whether that was reflective of actual public opinion or not.
However you feel about it, it's an issue that's worth consideration - should a platform as influential as Facebook be allowed to operate free of regulation in terms of news bias?
And if not, how can such concerns be addressed?
UPDATE: Facebook has addressed concerns about its Trending section both on its blog and via interviews with several major outlets. Facebook denies any bias or tampering with news content, but a leaked document outlines exactly how the Trending team operates - noting that staff can, in fact, manually inject news stories and influence their relevance.