After a recent report found that Facebook is hosting thousands of groups and Pages, with millions of members and followers, that support the dangerous QAnon conspiracy theory, Facebook has today announced that it's taking action, removing or restricting a broad swathe of linked entities and updating its rules to crack down on QAnon content.
As per Facebook:
"Today we are taking action against Facebook Pages, Groups and Instagram accounts tied to offline anarchist groups that support violent acts amidst protests, US-based militia organizations and QAnon. [We're also] expanding our Dangerous Individuals and Organizations policy to address organizations and movements that have demonstrated significant risks to public safety, but do not meet the rigorous criteria to be designated as a dangerous organization and banned from having any presence on our platform."
And as noted, the change will impact a lot of accounts - Facebook says that:
- Over 790 Groups have been terminated
- 100 Pages have been banned
- 1,500 ads tied to QAnon have been blocked
- 300 related hashtags across Facebook and Instagram have also been blocked
- Over 1,950 Groups and 440 Pages on Facebook have been restricted, as have more than 10,000 related Instagram accounts
The announcement is significant, given Facebook's tacit facilitation of such movements over time.
As noted by NBC News:
"Facebook has been key to QAnon's growth, in large part due to the platform's Groups feature, which has also seen a significant uptick in use since the social network began emphasizing it in 2017."
QAnon has gained a fervent following online for purportedly sharing secret insights about the Trump administration's ongoing battle against the 'deep state', a collection of elite business people and celebrities who, the group says, are secretly controlling the world.
The group has become more problematic in recent times due to links to various violent incidents and threats, which led to the FBI designating QAnon as a potential domestic terrorist threat in August last year.
Facebook's decision to crack down on QAnon content follows Twitter's announcement of the same last month, while other platforms like Reddit have also taken action against the group.
The changes also come as Facebook faces increased scrutiny over its role in facilitating the spread of hate speech online. Last month, civil rights activists led a boycott of Facebook ads in protest over the company's decision not to remove hate speech posted by US President Donald Trump.
And with a divisive US Election looming, Facebook needs to ensure that it's doing all it can to reduce tensions where possible, especially with respect to credible threats.
But it's not a full elimination of QAnon and related discussion. As Facebook notes:
"While we will allow people to post content that supports these movements and groups, so long as they do not otherwise violate our content policies, we will restrict their ability to organize on our platform."
It seems that Facebook should take a harder line here, but the company, as always, wants to avoid being the referee of what's acceptable and what's not, which is why it's looking to allow some discussion to continue.
Facebook says that it will continue to improve its detection and enforcement efforts by 'studying specific terminology and symbolism used by supporters' in order to identify key language and markers in such posts.
It's not perfect, but it is an important step for Facebook to take, especially given the platform's massive reach and potential to amplify dangerous messaging.