As Facebook works to better manage the role it plays in the distribution of questionable content - particularly politically-motivated material and misleading and/or false reports - The Social Network has this week outlined its plans for a new, independent content review board which will provide an updated framework for platform posting regulations, and a new way for users to appeal content decisions.
As outlined in the draft charter:
"Every day, teams at Facebook make difficult decisions about what content should stay up and what should come down. As our community has grown to more than 2 billion people, we have come to believe that Facebook should not make so many of those decisions on its own - that people should be able to request an appeal of our content decisions to an independent body."
Spearheaded by former British MP Nick Clegg, the new board will be made up of around 40 experts "with experience in content, privacy, free expression, human rights, journalism, civil rights, safety and other relevant disciplines". Facebook has committed to maintaining a public list of all the board members, and any content decisions made by the group will be enforced by full-time Facebook staff, who are not members of the board themselves.
Ideally, the new group of experts will help Facebook better manage content decisions, particularly in relation to broader rules on what is and is not acceptable on the platform. Facebook has long sought to distance itself from this element, repeatedly noting that it is not a media company, and therefore not in the business of providing editorial oversight. But when you're in charge of the largest, and arguably most influential, platform in the world, there comes a certain level of responsibility for what you allow to be shared and hosted on your servers.
The development of an independent board enables Facebook to apply a level of content moderation with input from recognized leaders in the field - as opposed to Zuck and Co. making such calls themselves, and opening themselves up to criticism from impacted groups.
Of course, much of that criticism will still come anyway - those whose posts could potentially be impacted will question who decides who's appointed to the group, how its decisions are made, and whether Facebook is seeking to silence conservatives.
Unfortunately, there's no perfect way around this - at least with this approach, Facebook can add a higher level of transparency, and distance from any such decisions.
As noted in the announcement post:
"[The board] will be obligated to the people who use Facebook - not Facebook the company"
This is always going to be a difficult area, and Facebook - along with every other online network - has struggled for years to strike the right balance between allowing free speech and maintaining some boundaries. Social networks have long argued that they should not be the ones in charge of setting those parameters, but there are certain obvious points where they've been forced to intervene - for example, criminal content and hate speech. But even then, where do you draw the line on those subjects? At what point does something move from acceptable into the grey area, and then become a clear violation?
That's what Facebook's Oversight Board will now rule on. It'll never be a perfect system, but it does distance Facebook itself, at least somewhat, from making those tough rulings.
The Social Network is planning to run a series of workshops around the world over the next six months, where it will "convene experts and organizations who work on a range of issues such as free expression, technology and democracy, procedural fairness and human rights". Once those discussions are complete, Facebook will move forward with the next phase of the project - you can review the draft charter for yourself here.
Along similar lines, Facebook has also announced an expansion of its efforts to protect democratic elections, with new regulations coming into effect in various regions to stamp out election interference.