Facebook has a lot of work ahead of it to clean up its platform.
As an example, The Social Network has already removed more than 3,500 Facebook and Instagram Pages, accounts and groups for "coordinated inauthentic behavior" in the first two-and-a-bit months of this year alone - the most recent being removals in the UK and Romania this week.
This is part of Facebook's expanded effort to boost election security, and to ensure political groups are not working to manipulate voters through Facebook. But the spread of misinformation online obviously extends far beyond propaganda and election campaigning alone, and could be causing even larger societal damage by facilitating other fringe movements.
One of the most concerning of these is the rise of "anti-vax" sentiment, with more and more groups forming to promote dangerous messages about the supposed risks of vaccines, and conspiracy theories about why people are being "forced" to vaccinate their children.

An example of an anti-vax ad (via The Guardian)
The problem is, there's little to no scientific evidence to support these claims - but that hasn't stopped them from spreading. According to the World Health Organization, Europe saw a record number of measles cases in 2018, due, at least in part, to a growing number of parents refusing vaccinations for their children, while in America - where measles was officially declared eliminated in 2000 - reports of outbreaks are, once again, becoming common.
This is not a trifling concern that can be argued over in internet memes - it's a major health risk to millions, if not billions, of people. As such, Facebook has this week announced a new set of measures to stop the spread of anti-vax content, following the lead of Pinterest, which recently announced its own steps to halt distribution of the same.
Facebook has announced that it will undertake the following steps:
- We will reduce the ranking of groups and Pages that spread misinformation about vaccinations in News Feed and Search. These groups and Pages will not be included in recommendations or in predictions when you type into Search.
- When we find ads that include misinformation about vaccinations, we will reject them. We've also removed related targeting options, like “vaccine controversies.” For ad accounts that continue to violate our policies, we may take further action, such as disabling the ad account.
- We won’t show or recommend content that contains misinformation about vaccinations on Instagram Explore or hashtag pages.
- We are exploring ways to share educational information about vaccines when people come across misinformation on this topic.
Facebook isn't banning such content outright - Pinterest, by comparison, will now delete anti-vax content it detects on its platform - but it is looking to limit the reach of such material, and reduce its spread throughout Facebook's network.
And as any publisher can tell you, Facebook's capacity to limit reach is significant. Ideally, Facebook would take an even tougher stance on such content, but this move does show that the platform is becoming more willing to take action against such material, as opposed to the 'hands off' approach it's leaned on in the past.
As noted, the spread of misinformation in a political context is hugely concerning, but it can become literally deadly when it relates to health issues like this. Hopefully, this is the beginning of stronger action from Facebook on similar concerns and movements.