Facebook has announced its latest measure to combat the spread of fake and misleading news through its network, with the addition of a new related articles listing that will appear before you click on an article in your News Feed.
In the example shown, when a user has shared a post about an unnamed 'medical advancement', Facebook has provided a set of links beneath that article, which could help expose readers to alternate perspectives on the topic.
So, for example, if the post claimed that new research shows smoking is actually good for you, Facebook could counter it with reports underlining that this isn't true. And if three reports immediately below contradict it, all from reputable sources, maybe the user will think twice about the validity of the original story.
Facebook says the new related links will appear on topics 'many people are talking about on Facebook' and may include content from third-party fact-checkers. This could give Facebook an easy way to quickly dispel rumors that are gaining traction - rather than adding these pre-emptive related stories links to every post, Facebook's editorial team could tag any false report it identifies as spreading, linking these qualifying insights to it.
And if the option gets rolled out more widely, it could eventually become an immediate red flag. If you see a story with pre-emptive links trailing down from it, you'll come to know that it's probably not true, which might help stop people from firing off an impulse reaction in the comments (and thus boosting the original post's reach).
It's the latest step in Facebook's quest to better inform users and ensure misleading content is minimized, a process that will take some time, and various efforts, to enact.
In his recent F8 presentation on the News Feed algorithm and how it works, Facebook News Feed VP Adam Mosseri also outlined the team's key areas of focus for the system moving forward, with 'integrity' - and how Facebook can better inform communities through the information it presents - being one of their three focal points.
Or, as Mosseri puts it:
"How can we better nurture the good and address the bad?"
But combating fake news is tough. One of the key problems with Facebook is that it's built to foster viral sharing - the more people who engage with something, the more people see it - which, in some respects, incentivizes publishers to create content that inspires debate. If you can get a hundred people debating the merits of a post in the comments, your reach will be massive. Because of this, it makes more sense to publish a headline like 'New Report: People hate cats' than a more measured one like 'New survey shows more people like dogs than cats'. The former inspires more emotional response, which is key to Facebook reach.
This is partly what's fuelled the current situation with news bias and the discussion of fake news - in the current online news ecosystem, it makes more sense to take a stance one way or another, to report a divisive opinion, than it does to take a more measured approach. Clickbait and sensationalism are essentially rewarded - and when you also consider that up to 80% of people only ever read the headline of a post, you get a better understanding of why such debate occurs. Often, the content itself is not as relevant as the discussion it inspires.
This is why Facebook's battle against fake news is complex - its network model is built around a system which shows people more content that supports their existing world view, enabling them to filter out alternate perspectives.
Finding ways to break into those filter bubbles is the way forward, and smaller measures like this new related articles change could play a significant part in shifting the dynamic.
In terms of business use, it adds another reason to ensure you're double-checking the details of anything you share before you post it. If the new tool does get rolled out, you don't want your links to show up with qualifying stories, as it could quickly erode your reputation with your audience.