In yet another reminder of the power of social networks to fuel extremist movements, Facebook has today announced that it's removing a cluster of accounts and groups linked to the violent 'boogaloo' movement in the US.
As explained by Facebook:
"Today we are designating a violent US-based anti-government network as a dangerous organization and banning it from our platform. This network uses the term 'boogaloo' but is distinct from the broader and loosely-affiliated 'boogaloo' movement because it actively seeks to commit violence. For months, we have removed 'boogaloo' content when there is a clear connection to violence or a credible threat to public safety, and today’s designation will mean we remove more content going forward, including Facebook Groups and Pages.
The 'boogaloo' movement is a loosely organized right-wing extremist movement that has been linked to various violent crimes, and which has seen increased activity in recent months.
As explained by The Verge's Casey Newton:
"The roughly seven-year-old boogaloo movement represents a loosely knit group of right-wing extremists, some of whom advocate for a second Civil War. Its name derives from the camp classic breakdancing movie Breakin’ 2: Electric Boogaloo; “electric boogaloo” has become an ironic way of referring to sequels."
The content shared within these networks often takes on a satirical, even comical tone, but certain members have been moved to real-world action based on these communications.
Again, from Facebook:
"It is actively promoting violence against civilians, law enforcement and government officials and institutions. Members of this network seek to recruit others within the broader 'boogaloo' movement, sharing the same content online and adopting the same offline appearance as others in the movement to do so."
In this initial action against the boogaloo movement, Facebook is removing 220 Facebook accounts, 95 Instagram accounts, 28 Pages and 106 groups directly tied to the network.
"We have also removed over 400 additional groups and over 100 other Pages for violating our Dangerous Individuals and Organizations policy as they hosted similar content as the violent network we disrupted but were maintained by accounts outside of it."
Facebook notes that this is the first time it has taken broad action against content within the boogaloo movement - while it has always removed content where there's a clear call for violence, it hadn't previously felt the need to make a more cohesive push against the group, given the nature of these posts.
That, as noted, has changed more recently:
"We removed over 800 posts for violating our Violence and Incitement policy over the last two months and limited the distribution of Pages and groups referencing the movement by removing them from the recommendations we show people on Facebook."
Recent civil unrest in the US, inflamed by tensions around the COVID-19 lockdowns and the #BlackLivesMatter movement, seems to have sparked an escalation in boogaloo rhetoric, which has prompted Facebook to take action now to reduce the network's potential influence.
That's a positive development, in that Facebook is looking to take a more proactive role in detecting and removing such content before it gets further out of hand. It's also another step towards greater censorship and content moderation from The Social Network, which has infamously refused to take similar action on hate speech or violent rhetoric from other sources.
Could this be an indicator that Facebook is shifting its thinking, and looking to address such content in a more definitive way?
Could that also have been sparked by the #StopHateforProfit campaign, which has now seen a number of big-name advertisers pledge to pause their Facebook ad spend over the platform's failure to act on such content?
Either way, the outcome is that there will be less inflammatory and dangerous content on Facebook, which is a good result. Now to see if other platforms follow suit. Reddit, for example, has this week implemented new rules to address hate speech and threats, which led to the removal of some 2,000 subreddits.
Will Reddit now also make a similar push against boogaloo-related material?
It seems like an area that should be addressed, though it'll no doubt once again spark more criticism of these platforms for seeking to silence certain voices.