This is an interesting move from Facebook.
In a long letter addressing Facebook's progress in tackling the platform's problems with user safety and the spread of misinformation, CEO Mark Zuckerberg has outlined a new News Feed algorithm update that aims to limit the incentive for Pages to share sensationalized content.
It's difficult to understand exactly how it's going to work in practice, but Zuckerberg explains it like this:
"One of the biggest issues social networks face is that, when left unchecked, people will engage disproportionately with more sensationalist and provocative content. This is not a new phenomenon. It is widespread on cable news today and has been a staple of tabloids for more than a century. At scale it can undermine the quality of public discourse and lead to polarization. In our case, it can also degrade the quality of our services."
Zuckerberg's right - the old adage of 'if it bleeds, it leads' has long been established in news publishing. But social media, and Facebook in particular, amplifies this to unprecedented levels.
But it's not an easy problem to solve - people engage with such content precisely because it's divisive, because it's controversial. There's a reason why figures like Alex Jones are able to spark so much coverage.
So how can Facebook address this? The new News Feed update will aim to reduce the reach of controversial content by actively limiting distribution for clearly sensationalized posts.
"Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average -- even when they tell us afterwards they don't like the content. This is a basic incentive problem that we can address by penalizing borderline content so it gets less distribution and engagement. By making the distribution curve look like the graph below where distribution declines as content gets more sensational, people are disincentivized from creating provocative content that is as close to the line as possible."
[Graph from Zuckerberg's post: the adjusted distribution curve, where distribution declines as content gets more sensational]
In more basic terms, Zuckerberg's saying that rather than the current situation - where controversial, but not forbidden, content is rewarded with greater distribution (as per the graph below) - Facebook wants to change the model so that reach actually declines the closer a post gets to breaching the rules.
[Graph from Zuckerberg's post: the natural engagement pattern, where engagement rises as content approaches the policy line]
Zuckerberg provides an example of posts which get close to violating the platform's rules on nudity:
"For example, photos close to the line of nudity, like with revealing clothing or sexually suggestive positions, got more engagement on average before we changed the distribution curve to discourage this. The same goes for posts that don't come within our definition of hate speech but are still offensive."
As noted, these posts don't break the rules, but they come close, and Facebook is now seeking to discourage this, reversing the current dynamic, which effectively rewards it.
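To make the mechanics a little more concrete, here's a minimal sketch of the concept - not Facebook's actual system, whose models and thresholds aren't public - assuming a hypothetical classifier that gives each post a 'borderline score' between 0 (clearly fine) and 1 (right at the policy line):

```python
# Hypothetical illustration of the flipped distribution curve.
# 'borderline_score' is an assumed classifier output: 0.0 = clearly
# within the rules, 1.0 = right at the policy line (beyond the line,
# the post would simply be removed).

def natural_engagement(borderline_score: float) -> float:
    """The pattern Zuckerberg describes: engagement rises as
    content gets closer to the line."""
    return 1.0 + borderline_score

def adjusted_distribution(borderline_score: float, penalty: float = 2.0) -> float:
    """The proposed change: distribution declines as the score rises,
    so borderline posts reach fewer people, not more."""
    return max(0.0, 1.0 - penalty * borderline_score)

# Compare the two curves at a few sample scores.
for score in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"score={score:.2f}  natural={natural_engagement(score):.2f}  "
          f"adjusted={adjusted_distribution(score):.2f}")
```

The point of the flipped curve is that creators now lose reach, rather than gain it, by edging toward the line, which removes the incentive to produce near-violating content in the first place.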
It's an interesting idea - one of the biggest problems with social media platforms being used to disseminate misinformation is that sensationalist content simply works. It generates more discussion and more comments, and inspires more shares, both from people who strongly agree and those who strongly disagree with the material. Triggering an emotional response is key to viral sharing, and nothing does so better than a controversial opinion.
That, in my opinion, is what has skewed news coverage more broadly - in times past, news publishers didn't have detailed insight into which specific articles in their publications were getting the most attention. But now, in the age of digital distribution, where everything can be measured in clicks and comments, everyone knows exactly what elicits the most response. And since more clicks equal more revenue, the system is essentially geared towards sensationalism.
That, in turn, has fueled the 'fake news' epidemic. Here's an example - the post below recently saw huge distribution on Facebook, with thousands of users expressing their views (mostly anger) about what the post frames as a religiously motivated attack.
[Screenshot: the viral Facebook video post in question]
But that description is totally untrue - this is not a video of a Muslim refugee destroying a religious statue in Italy, it's an incident that happened in Algeria late last year. The man attacked the statue on the Ain El Fouara fountain because it depicts a naked woman, which he considers indecent. The same statue has been vandalized several times for the same reason - Algeria is a majority Muslim nation, and many see the depiction as distasteful.
But as you can see from the view count, the reality doesn't matter. Reporting the facts of the incident wouldn't trigger such a response, but taking a more sensationalist - and in this case, untrue - angle has boosted the post to viral levels.
And you know what else happens? The people who see this question why they aren't seeing it in mainstream news coverage.
Their verdict? We're being sheltered from the truth. The reality? It never happened as described.
That's not to say this is what mainstream news outlets are doing, but it illustrates how both fake news and sensationalist content spread - a problem that's very hard to stop. In this case, the post does actually violate Facebook's misinformation rules, but often, as Zuckerberg notes, it's not so clear-cut. That's why this new measure is so important, and why it could have a major impact.
In addition, Zuckerberg notes that Facebook's looking to establish a new independent body to handle appeals over content rulings.
"In the next year, we're planning to create a new way for people to appeal content decisions to an independent body, whose decisions would be transparent and binding. The purpose of this body would be to uphold the principle of giving people a voice while also recognizing the reality of keeping people safe."
This seeks to maintain Facebook's content standards, and restrict misinformation and the like, without Facebook having to move into the area of editorial control. Facebook can then make rulings, but pass responsibility for their enforcement onto a third party as necessary, lessening its judgment burden.
It's a clever way to sidestep editorial control, something Facebook has long worked to avoid. Both measures are notable in their own right, and it'll be interesting to see whether they actually have the desired effect - and whether the spread of misinformation on the platform does, indeed, decrease.
You can read Zuckerberg's full post here.