Facebook has this week mapped out the next stages in the implementation of its independent Content Oversight Board, which will help the platform make more informed decisions on what should, and should not, be allowed on its network under its rules.
The Content Oversight Board, which Facebook has been working on over the last year, will review cases where users disagree with initial content decisions made by Facebook - whether those decisions have resulted in content being removed or left up after a complaint. Some of these cases, according to Facebook, "go to the core of how we balance safety and free expression", and as such, the Oversight Board will provide additional, independent review, and recommend subsequent action to Facebook, one way or another.
But there will be some limitations built into this process.
As per the Board's bylaws, while the Board will recommend actions to be taken, and Facebook will be committed to implementing its decisions on individual cases, Facebook won't be obliged to make broader policy changes based on the same findings.
As per Facebook:
"Facebook is committed to implementing the board’s decision on individual pieces of content within seven days, as outlined in the bylaws. Facebook will also assess the technical and operational feasibility of applying the decision to identical content with parallel context, as explained in the bylaws."
So Facebook will look to apply the Board's findings to other, similar cases. But it won't be obliged to take action beyond the initial report.
"When the board provides an additional policy recommendation, Facebook will review that guidance. Some recommendations may involve only minor modifications to current policies or practices, while others may involve more substantial or complex changes. The latter will go through our full policy development process or other appropriate channels. This will allow for a thorough and considered analysis of the proposed policy recommendation, as well as additional stakeholder engagement."
So the Board can say 'this is not acceptable, the platform rules need to be changed', and Facebook can nod in agreement, then move on without changing anything. Facebook is not obliged to make rule changes based on the Board's assessment.
Facebook will, however, need to provide a public response to any policy recommendations, and detail any follow-on action, within 30 days. That's an important step toward accountability, but the lack of any obligation to act does dilute the broader powers of the Board to a degree. The Board will also, in its initial stages, be limited to reviewing only content that Facebook has taken down, in order to keep its caseload manageable.
"Given the large number of content decisions Facebook makes, as well as the time it will take to hear cases, we expect the board will choose cases that have the greatest potential to guide Facebook’s future decisions and policies. We expect the board to come to a case decision, and for Facebook to have acted on that decision, in approximately 90 days."
Facebook will also have the capacity to alter the Board's bylaws, to a degree.
So, not exactly independent. It's a step in the right direction, clearly, but the bylaws suggest that Facebook will still retain some level of control over how the Board's rulings are implemented, and how the Board is governed more broadly.
But then again, that may also, in some ways, be the point.
As we head into an election year, Facebook knows that it's going to come under intense scrutiny over its content decisions - what it decides to leave up (like, say, lies in political ads) and what it takes down due to violations of platform rules.
Each of these actions can have a significant impact, especially when you consider the reach and influence of the platform. Some 68% of U.S. adults now get at least some of their news content via social media apps, with Facebook the leading provider by a significant margin. In fact, Americans are now more likely to stay informed on the latest issues via social media than via newspapers. And when you also consider that Facebook's own research has shown that it has the power to influence the outcome of elections, it's clear that Facebook's content decisions carry enormous weight, and can play a major role in informing, or misinforming, the public.
That's why Facebook has appointed the Content Oversight Board - to help ensure that such decisions, and the subsequent actions carried out, are independent. These decisions and directives are not coming from Facebook as such; they're coming from external experts, who will govern case-by-case rulings on the more complex content cases, relieving The Social Network of some responsibility, while also, ideally, ensuring better outcomes, detached from what's seen as best for the company itself.
But the bylaws, as outlined, don't exactly inspire confidence that the Oversight Board will fix everything in this respect.
But then again, it's not meant to - the project is not intended to be a bottom-line solution that will resolve all of the varying conflicts around content moderation and address every such concern in a fully independent way.
In fact, one could argue that the board is intended to fail, in order to demonstrate just how difficult it is to combat misinformation and make definitive calls on content moderation and the parameters of free speech.
Facebook CEO Mark Zuckerberg alluded to this in his recent 2020 'personal challenge' update, in which he explained the complex issues that Facebook is facing, and the even more complex way forward in addressing such concerns:
"Platforms like Facebook have to make tradeoffs on social values we all hold dear - like between free expression and safety, or between privacy and law enforcement, or between creating open systems and locking down data and access. It's rare that there's ever a clear "right" answer, and in many cases it's as important that the decisions are made in a way that feels legitimate to the community. From this perspective, I don't think private companies should be making so many important decisions that touch on fundamental democratic values."
Zuckerberg's 'solution', which he expects to become a key focus over the next decade, is external regulation of what's acceptable, independent of Facebook and other social platforms:
"One way to address this is through regulation. As long as our governments are seen as legitimate, rules established through a democratic process could add more legitimacy and trust than rules defined by companies alone. There are a number of areas where I believe governments establishing clearer rules would be helpful, including around elections, harmful content, privacy, and data portability. I've called for new regulation in these areas and over the next decade I hope we get clearer rules for the internet."
In that same overview, Zuckerberg specifically points to Facebook's Oversight Board, which, Zuck says, is an example of a community governing itself.
"An example of independent governance is the Oversight Board we're creating. Soon you'll be able to appeal content decisions you disagree with to an independent board that will have the final decision in whether something is allowed. This decade, I hope to use my position to establish more community governance and more institutions like this. If this is successful, it could be a model for other online communities in the future."
Essentially - and depending on how you read this - Zuckerberg is saying that the ultimate solution would be a broader ruling on what is and is not acceptable in online communications, applied universally, across all platforms, though a smaller-scale option could also lie in independent governance boards like this. If it works.
The impression I get is that Zuckerberg is not confident that this will, indeed, work, but that it will serve as an example of why broader oversight and rulings are needed - because even an 'independent' board has to be appointed by someone, which means that, invariably, the board will still have ties to the platform itself in some capacity.
That's clearly the case here - Facebook still holds significant sway over the final decisions, and will make calls on any broader actions depending on how it interprets each ruling. That's an inescapable clause built into this system - the solution, as Zuck notes, is for a fully independent oversight group to do the same across all apps, and set definitive, binding rules, taking the responsibility away from the platforms themselves.
That would take a lot of heat off Facebook - while conversely placing a lot onto whichever body takes on that role. Would it be national or international? And if international, would culturally sensitive exemptions be applied regionally, or would one standard apply globally?
Basically, Facebook's Oversight Board looks set to ultimately become an experiment that demonstrates the limitations of any approach that is not fully independent. And until a decision is made on a fully independent group, those limitations will remain.