The Wall Street Journal’s ‘Facebook Files’ report, which details various leaked insights from inside Facebook HQ, has certainly sparked a whole new range of concerns for The Social Network.
Among the various issues outlined in the multi-part investigation was the notion that Facebook holds celebrities and high-profile users to a different standard, with a separate moderation team double-checking their posts and updates, and potentially leaving up content that would have been removed had it come from everyday users.
Facebook has previously outlined this double-checking process, which it claims ensures that the right decision is made, “so that [posts] are not mistakenly removed or left up”, while also denying that the process gives these Pages any special treatment.
But regardless, Facebook also acknowledges that the system is not perfect, and this week, it’s referred the process to its independent Oversight Board for review, in an effort to establish a better way forward in assessing such content.
As explained by Facebook:
“Facebook reviews billions of pieces of content every day, has 40,000 people working on safety and security, and has built some of the most sophisticated technology to help with content enforcement. Despite that, we know we are going to make mistakes. The cross-check system was built to prevent potential over-enforcement mistakes and to double-check cases where, for example, a decision could require more understanding or there could be a higher risk for a mistake. This could include activists raising awareness of instances of violence, journalists reporting from conflict zones or other content from high-visibility Pages and profiles where correct enforcement is especially important given the number of people who could see it.”
So, again, Facebook says that the process is more designed to ensure high profile, high impact mistakes are avoided, which is why it has a secondary check process in place – not to give celebrities more leeway for posting whatever they like.
Facebook says that it is always working to improve this process, and insights from the Oversight Board will play a part in this refinement.
“Holding Facebook accountable for our content policies and processes is exactly why the Oversight Board was established. Over the coming weeks and months, we will continue to brief the board on our cross-check system and engage with them to answer their questions.”
To be clear, the Oversight Board first called on Facebook to provide more insight into the cross-check process as a result of the WSJ report:
“At the Oversight Board, we have been asking questions about cross-check for some time. In our decision concerning former US President Donald Trump’s accounts, we warned that a lack of clear public information on cross-check and Facebook’s ‘newsworthiness exception’ could contribute to perceptions that Facebook is unduly influenced by political and commercial considerations.”
That refers to Facebook’s stance on comments posted by former President Trump, which Facebook chose not to take action on due to their newsworthiness and relevance to the community.
Indeed, in a speech at Georgetown University back in 2019, Facebook CEO Mark Zuckerberg underlined this approach, which sparked the initial backlash in this respect:
“We don’t fact-check political ads. We don’t do this to help politicians, but because we think people should be able to see for themselves what politicians are saying. And if content is newsworthy, we also won’t take it down even if it would otherwise conflict with many of our standards.”
Facebook’s stance, as always, errs on the side of free speech, but more recent developments have forced a reassessment of this, and a broader reconsideration of the role that Facebook plays in communication, and its responsibilities on that front.
Which is why the revelations about its cross-check system stand out, because they do seem to align with Facebook’s clear preference to let more content be shared in its apps, and to avoid having to police it through its own means.
Facebook’s broader view is that there should be some form of official regulation in the social media space, and that the platforms themselves should not have to establish such rules independently. Which is likely the better way to go, but thus far, there’s been little movement in establishing an independent oversight committee, outside of Facebook’s own efforts, which it’s using to illustrate the need for such a body.
Ideally, for Facebook, another regulatory group would take such decisions out of its hands, but right now, the responsibility remains with it, and each social platform, to rule on what’s acceptable and what’s not, and the specific parameters relating to such.
Which, given that these are independent businesses, reporting to their shareholders, doesn’t really seem like the best approach, especially as their influence grows every day.
The eventual answer seems, inevitably, to point to independent oversight, but regional variances and other complications do also pose significant challenges in this respect.
Which is why Facebook established its own Oversight Board, and why it’s now looking to refer such decisions to it, as a pressure release of sorts, while also relieving its own team of that responsibility.
Which may seem like a quick way out in some respects - but really, it’s the model for where we should be headed.
Like it or not, Facebook’s own approach may be the best way to deal with its various concerns.