So this could be problematic.
In the second installment of its explanation of coming changes to the News Feed algorithm, Facebook has detailed how it plans to rank news sites by trustworthiness, in order to determine whether they get a distribution boost or penalty.
As explained by Mark Zuckerberg:
“The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you -- the community -- and have your feedback determine the ranking.”
Makes sense so far – I can see the logic of the approach.
“We decided that having the community determine which sources are broadly trusted would be most objective. Here's how this will work. As part of our ongoing quality surveys, we will now ask people whether they're familiar with a news source and, if so, whether they trust that source. The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don't follow them directly. (We eliminate from the sample those who aren't familiar with a source, so the output is a ratio of those who trust the source to those who are familiar with it.)”
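The mechanics Zuckerberg describes – drop respondents who aren't familiar with a source, then take the share of the remaining respondents who trust it – can be sketched roughly as follows. This is a hypothetical illustration of the described ratio only; Facebook hasn't published its actual scoring code, and the `trust_score` function and response format here are invented for the example.

```python
def trust_score(responses):
    """Trust ratio as described: exclude respondents unfamiliar with
    the source, then return the fraction of familiar respondents who
    trust it. Each response is a dict like
    {"familiar": bool, "trusts": bool}.
    (Hypothetical sketch - Facebook's real methodology is unpublished.)
    """
    familiar = [r for r in responses if r["familiar"]]
    if not familiar:
        return None  # nobody knows the source, so no score
    return sum(r["trusts"] for r in familiar) / len(familiar)

responses = [
    {"familiar": True,  "trusts": True},
    {"familiar": True,  "trusts": False},
    {"familiar": False, "trusts": False},  # excluded from the sample
    {"familiar": True,  "trusts": True},
]
print(trust_score(responses))  # 2 of the 3 familiar respondents trust it
```

The key design point in the quote is that the denominator is familiarity, not the whole sample – a niche outlet known only to its own readers isn't penalized simply for being less famous.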
Okay, possible issue.
Most notably, there could be a problem with the testing methodology. Facebook has previously noted that it has a News Feed ‘Quality Panel’ of some 700 reviewers who provide real, human feedback on News Feed results, which then helps the data team make more informed decisions. Facebook has also stated that it receives feedback from tens of thousands of users every day via user response surveys – in combination, you’d assume these two elements comprise the pool of feedback it’ll be working with to determine content trustworthiness.
But who are these people, exactly? Where are they from?
According to Facebook News Feed chief Adam Mosseri:
“We surveyed a diverse and representative sample of people using Facebook across the US to gauge their familiarity with, and trust in, various different sources of news. This data will help to inform ranking in News Feed.”
So, cool, right? The program's rolling out in the US only, initially, and a pool of US reviewers has provided feedback on who’s trustworthy and who’s not.
Except, it still feels a little questionable.
The logic of Facebook’s approach does make sense – on a wide enough scale, you’d expect clear trends to emerge, even if smaller sub-groups could potentially skew the results. But some outlets may also suffer unfairly. Is BuzzFeed, for example, a trustworthy publisher? I think they produce some top-quality journalistic content, but they also publish random surveys and pop-culture quizzes. If BuzzFeed were on your list and you were asked to rate your level of trust in them, what would you say?
There may be quite a few publications in this category – publishers who’ve created content in line with what Facebook’s News Feed has previously rewarded, and who are now set to be penalized for it, because, in some ways, that content may have diluted the news value of their brand.
There are also some positives to that – sensationalism and click-bait style headlines which get people talking, but are ultimately revealed as hollow, may have burned enough readers that they now place less trust in a publisher, and thus, that publisher's distribution goes down. But again, those publications were only working with what Facebook had previously rewarded. It seems a little unfair to reduce their reach because of it.
But this is the new Facebook, and the push against divisive and sensational content does make sense. In fact, one of my criticisms of the recently announced News Feed changes was that they would promote content which gets people talking – which tends to be exactly this kind of sensationalized, polarizing material, created purely to drive debate.
This update could counter that, at least to some degree, though as with the broader revisions, it’s impossible to know until we see them actually rolled out.
And on that, there is one particularly relevant detail worthy of highlighting in Mark Zuckerberg’s latest post:
“Last week I announced a major change to encourage meaningful social interactions with family and friends over passive consumption. As a result, you'll see less public content, including news, video, and posts from brands. After this change, we expect news to make up roughly 4% of News Feed - down from roughly 5% today.”
A one percentage point shift – a 20% relative reduction in news content – is significant, particularly at Facebook’s scale. But it’s not the end of news and Page content as we know it.
Mosseri notes that US publications deemed trustworthy by people using Facebook may see an increase in their distribution.
"Publications that don’t score highly as trusted by the community may see a decrease."
The next question will be whether Pages are informed of their ‘trustworthiness’ so they can improve their standing.