There's no getting around it - Facebook, with its 2.3 billion active users, has grown to become the most wide-reaching, influential platform on the internet, and maybe the most significant single driver of human interaction of all time.
Sure, other platforms like Google and YouTube may see comparable traffic numbers, but neither facilitates the intimacy of personal communication the way Facebook does, and that is hugely powerful. People trust the information and opinions shared by friends and family more than anything else, and The Social Network provides the framework for that at scale, which gives Zuck and Co. immense power and oversight - worryingly so, many have argued. That power also comes with a huge level of responsibility, which, since the 2016 US Presidential Election, has been made abundantly clear.
Now, even Mark Zuckerberg himself is calling for help in managing this, but Facebook also knows that the regulatory process takes time. So it's taking its own action - this week, Facebook has announced a raft of new rule changes and tools designed to further limit the use of its platform for sharing misinformation and potentially harmful content.
They won't solve everything - they won't completely stop the sharing of divisive, offensive content on Facebook - your ill-informed uncle will still be able to spout conspiracy theories based on wildly selective information. But these steps matter - they may be Facebook's most significant, and potentially most meaningful, actions on this front to date.
Here's what's been announced.
Cracking Down on Group Sharing
Over the past two years, Facebook has been encouraging more use of groups, catering to the rising number of conversations shifting out of the main News Feed and into more private spaces.
This trend has been most noticeable in the rise of messaging use - with consumers becoming more wary of the public, permanently recorded nature of main feed sharing, and the potential impacts this can have on, say, your future career prospects, a growing number of people have sought alternative, enclosed spaces to chat and share, without fear of the same potential repercussions.
Cynics have also suggested Facebook's groups push may be about moving concerning content out of the public eye - if Facebook can't stop people sharing offensive, divisive content, at least moving it into private groups makes it less visible, and lessens scrutiny on the platform. That means that such content is still being shared, it's still fueling concerning movements, but as we're less exposed, it seems like less of an issue. Essentially, the suggestion is that Facebook has, in some ways, seen this as an opportunity to brush such concerns under the carpet and continue on, like nothing's happening.
But Facebook is taking action - as part of its evolving detection measures, Facebook says that it can already "proactively detect many types of violating content posted in groups before anyone reports it", whether it's posted in public, closed or secret groups.
Adding to this, Facebook's new regulations will see it implement penalties for groups which spread misinformation, even if such content doesn't necessarily violate Facebook's standards.
If any group is found to be consistently sharing links to questionable sites, as determined by Facebook's fact-checking partners, its reach will be downgraded, while Facebook will also factor in the actions of group moderators when assessing whether to penalize a group.
As explained by Facebook:
"As part of the Safe Communities Initiative, we will be holding the admins of Facebook Groups more accountable for Community Standards violations. Starting in the coming weeks, when reviewing a group to decide whether or not to take it down, we will look at admin and moderator content violations in that group, including member posts they have approved, as a stronger signal that the group violates our standards."
Lastly on groups, Facebook's also adding a new 'Group Quality' feature for admins, which will provide an overview of any content removed and/or flagged, as well as a section for false news found within the group. Facebook hopes to provide admins with a clearer view into Community Standards violations, in order to keep them on track, and out of Facebook jail - and ensure they're made aware of concerning material which is not acceptable on its platform.
These are significant and welcome changes for groups - and as noted, they do appear to directly address the concern that Facebook has been using its groups push to hide questionable content, as opposed to removing it.
Of course, we have to see how it works in action before we can start patting Facebook on the back - it still requires a lot of work by Facebook moderators, who already have a heap on their hands. But the rules are a good start, and may help reduce the toxicity of certain movements originating from the bowels of The Social Network.
Adding 'Click-Gap' as a News Feed Ranking Signal
This is a major update - amid the various criticisms of Facebook's system, one that's particularly hard to overlook is the fact that Facebook's algorithm essentially encourages publishers to use divisive, partisan headlines and angles in order to spark more debate, more comments and more reactions on the platform.
More audience response equals more reach in the News Feed algorithm - a story entitled "Man Shoots Neighbor After Dispute", for example, won't get as much reach on Facebook as "Black Man Shoots White Neighbor After Dispute". The latter sparks extended debate and triggers more emotional response. The algorithm's sharing mechanism not only supports this, it encourages it, which has led to more mainstream publishers taking more divisive stances in their reporting, while it's also fueled the disproportionate growth of smaller, niche conspiracy theorist publications and websites, which have gone on to become their own concerning movements.
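To make that incentive concrete, here's a minimal Python sketch of an engagement-weighted ranking score. The real News Feed model is proprietary and far more complex; the weights and numbers below are invented purely to show why posts that provoke comments and reactions tend to earn more reach.

```python
# Simplified, hypothetical engagement-weighted ranking score.
# The actual News Feed model is unpublished; these weights are invented
# only to illustrate why high-comment, high-reaction posts get more reach.

def engagement_score(reactions: int, comments: int, shares: int) -> float:
    """Weighted sum of audience responses; comments and shares weighted
    more heavily because they signal deeper engagement."""
    return 1.0 * reactions + 3.0 * comments + 5.0 * shares

neutral_headline = engagement_score(reactions=120, comments=15, shares=10)
divisive_headline = engagement_score(reactions=300, comments=400, shares=150)

# The divisive framing scores far higher, so a ranking system built on
# engagement surfaces it to more feeds - which is the incentive problem
# described above.
print(neutral_headline, divisive_headline)
```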
The biggest example of this is the anti-vax movement - and while Facebook can't be solely held responsible for such, there's clearly been an increase in anti-vax content, stemming largely from fringe websites and sources.
That's where this new measure will provide significant benefit.
In order to better qualify the content being shared on its network, Facebook is adding a new News Feed factor called "Click-Gap".
As explained by Facebook:
"Click-Gap, relies on the web graph, a conceptual “map” of the internet in which domains with a lot of inbound and outbound links are at the center of the graph and domains with fewer inbound and outbound links are at the edges. Click-Gap looks for domains with a disproportionate number of outbound Facebook clicks compared to their place in the web graph. This can be a sign that the domain is succeeding on News Feed in a way that doesn’t reflect the authority they’ve built outside it and is producing low-quality content."
This is similar to how Google uses backlinks and page authority in its rankings to determine website trust and reputation - using Click-Gap, Facebook will now be able to penalize the reach of links to websites which may be popular on Facebook, but are not seeing the same digital votes of confidence on the wider web.
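Facebook hasn't published the Click-Gap formula, but the concept it describes - comparing a domain's share of News Feed clicks against its footprint in the web graph - can be sketched roughly as follows. The field names, the authority proxy and the demotion threshold are all assumptions for illustration, not the actual signal.

```python
# Hypothetical sketch of a "Click-Gap"-style signal. Facebook hasn't published
# the real formula; the names, proxy metrics and threshold below are assumptions.

from dataclasses import dataclass

@dataclass
class Domain:
    name: str
    facebook_clicks: int   # outbound clicks the domain gets from News Feed
    inbound_links: int     # links pointing to the domain from across the web
    outbound_links: int    # links from the domain out to the rest of the web

def web_graph_authority(domain: Domain, total_links: int) -> float:
    """Crude stand-in for a domain's place in the web graph:
    its share of all inbound/outbound links."""
    return (domain.inbound_links + domain.outbound_links) / max(total_links, 1)

def click_gap(domain: Domain, total_fb_clicks: int, total_links: int) -> float:
    """Ratio of the domain's share of Facebook clicks to its share of
    web-graph links. Values well above 1 suggest the domain is far more
    popular on Facebook than its wider-web footprint would predict."""
    fb_share = domain.facebook_clicks / max(total_fb_clicks, 1)
    web_share = web_graph_authority(domain, total_links)
    return fb_share / max(web_share, 1e-9)

def demotion_factor(gap: float, threshold: float = 5.0) -> float:
    """Assumed penalty: leave normal domains alone, scale down reach for
    domains whose click gap exceeds the (made-up) threshold."""
    if gap <= threshold:
        return 1.0
    return threshold / gap   # the bigger the gap, the stronger the demotion
```

In practice, the authority side would come from something closer to a PageRank-style measure over a full web crawl rather than a raw link count, but the ratio idea is the same: reward domains whose Facebook popularity matches their standing on the wider web, and demote those where it doesn't.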
As noted by Wired:
"Click-Gap could be bad news for fringe sites that optimize their content to go viral on Facebook. Some of the most popular stories on Facebook come not from mainstream sites that also get lots of traffic from search or directly, but rather from small domains specifically designed to appeal to Facebook’s algorithms."
Indeed, looking at the most shared fake news reports on Facebook in 2018, you can see that the vast majority of them did not originate from reputable sources.

Click-Gap will seek to address this, and could have a significant impact, particularly given that it's built into the algorithm, so it doesn't rely on increased human intervention.
Fact-Checking, Instagram Reach and Messenger Verification
Facebook has also outlined a range of other, smaller measures to improve content sharing and user safety.
Addressing the noted concern of having too much work for too few moderators, Facebook is also kicking off a new program to find better ways to implement third-party fact-checking and content reporting.
"We need to find solutions that support original reporting, promote trusted information, complement our existing fact-checking programs and allow for people to express themselves freely - without having Facebook be the judge of what is true."
One of the suggested options is to have users play a larger role in such reporting - though that opens Facebook up to additional issues with misuse. Facebook CEO Mark Zuckerberg has already called for increased government regulation in such matters, so Facebook is not left as the sole arbiter of what's true, but there's still some way to go on this front.
Over on Instagram, Facebook has also started to restrict the reach of potentially offensive content, with a view to exerting more control over what's shared there too.
"We have begun reducing the spread of posts that are inappropriate but do not go against Instagram’s Community Guidelines, limiting those types of posts from being recommended on our Explore and hashtag pages."
Given Instagram's growth - and the fact that it has largely remained outside the scope of the criticism leveled at Facebook - it's important for The Social Network to address such concerns proactively to limit potential impacts.
And Facebook is also rolling out new verification badges in Messenger to stop impersonation.
"This tool will help people avoid scammers that pretend to be high-profile people by providing a visible indicator of a verified account. Messenger continues to encourage use of the Report Impersonations tool, introduced last year, if someone believes they are interacting with a someone pretending to be a friend."
There's a heap to take in here - and while most, if not all, of these measures will have little to no direct impact on digital marketers, it's important to understand how these updates could change your content reach and performance, even if only as a side effect.
On Click-Gap, for example, it's possible that there are a lot of websites which perform better on Facebook than via other sources, and have fewer cross-site links - and they could, potentially, see their reach impacted as a result. That's not the intention of the update, but as it's automated, it could further limit Facebook performance.
That's also not to say it will, but it's important to take note of major Facebook changes like these and consider their various impacts. Overall, these are positive, welcome moves, which should hopefully help Facebook cut down on the spread of junk. But only time will tell what the true impacts will be.