As Facebook has repeatedly stated, it's not a media company, which is why it really doesn't want to get into the practice of removing fake news reports and content - essentially an editorial function.
But Facebook does want to clean up its platform to ensure the best possible user experience, while also lessening the potential of its platform being used for political influence. So while The Social Network won't go all out and ban Pages sharing disputed content (yet), it is going to start implementing some new measures to crack down on two specific elements of annoyance and/or concern.
First off, Facebook's going to take a more proactive role in halting the spread of political misinformation for the US midterms as it relates to provable voter manipulation specifically.
As reported by Reuters:
"Facebook will ban false information about voting requirements, and fact-check fake reports of violence or long lines at polling stations ahead of next month’s U.S. midterm elections."
That's not exactly editorial influence over content on its platform, and Facebook has set some very clear parameters around how the process will work (which also lessens the workload for Facebook). But it will enable Facebook to exert some control over such misinformation - while there can be some debate over who decides what's fake news and what isn't, Facebook can crack down on verifiably false claims to ensure it lessens its potential contribution to political impacts.
It'll be interesting to see what the result of such efforts is - there's no way, of course, of definitively proving the influence of Facebook or any social network on voter behavior. But with reports of voter suppression tactics already in play, the distribution of such reports could be significant. Facebook may not want to police what's true and what isn't in a broader sense, but it does want to ensure its users are informed with accurate, actionable reports.
In addition to this, Facebook has also announced a new News Feed algorithm update which will demote links to sites that re-publish and redistribute content without permission, and flood those pages with ads.
As explained by Facebook:
"Starting today, we’re rolling out an update so people see fewer posts that link out to low-quality sites that predominantly copy and republish content from other sites without providing unique value."
According to TechCrunch, Facebook will catch out sites that post duplicate content by implementing a new system which will compare the main text content of a page with all other text content to find potential matches. It's a similar, though less technical, system to Google's duplicate content detection process, which demotes pages in search results where a conflict is detected.
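Facebook hasn't published the details of its matching system, but a common, simple approach to this kind of duplicate detection is to break each page's main text into overlapping word n-grams ("shingles") and compare pages by Jaccard similarity. The sketch below is purely illustrative - the function names, the shingle size and the 0.8 threshold are assumptions, not anything Facebook has disclosed:

```python
def shingles(text, n=5):
    """Split text into overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(1, len(words) - n + 1))}

def jaccard_similarity(a, b):
    """Jaccard similarity of two texts' shingle sets: |A ∩ B| / |A ∪ B|."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def looks_duplicated(page_text, corpus, threshold=0.8):
    """Flag a page whose text closely matches any already-seen page.
    The 0.8 threshold is an illustrative choice, not a known Facebook value."""
    return any(jaccard_similarity(page_text, other) >= threshold
               for other in corpus)

original = ("Facebook will demote links to sites that copy and republish "
            "content without adding unique value.")
scraped = original + " Click here!"
fresh = ("A completely different article about something else entirely, "
         "written from scratch.")

print(looks_duplicated(scraped, [original]))  # high overlap - likely a scrape
print(looks_duplicated(fresh, [original]))    # low overlap - original content
```

At web scale, a real system would hash the shingles (e.g. MinHash or SimHash) rather than compare raw text pairwise, but the underlying idea - score textual overlap and demote near-duplicates - is the same.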
As noted, both changes provide increased flexibility for Facebook to start removing questionable content from its network. It's not editorial license - Facebook's not making any calls on what is and isn't true in general terms. But it does give Facebook a few more measures to pick and choose what it allows on its platform.
Is this another move towards full editorial control by Facebook? Not really, but then again the results of the upcoming midterms, and the subsequent debate over Facebook's role, could determine whether Zuck and Co. need to take that next step. If politically-motivated groups flood the network in the lead-up, with clearly fake, easily verifiable reports, and that's believed to have influenced voters, Facebook may be forced to take action.
But if its newly introduced measures - ad transparency tools, third-party fact-checking on some reports, restrictions on 'issue ads' and the like - work, and there's no question of Facebook's influence in the vote, that may be enough.
It'll certainly be an interesting review in the wash-up.