Facebook has announced some new, harsher rules for group interactions, which will see the reach of some groups restricted, and some groups banned outright, due to repeated rule violations on the platform.
First off, on political groups - back in September, in the lead-up to the US Election, Facebook announced that it was removing civic and political groups from its group recommendations in the US, along with newly created groups, in an effort to reduce the impact of divisive content and misinformation.
Facebook has maintained that restriction ever since, with the Capitol riot in January prompting it to keep the measure in place.
Clearly, that's having some impact, because Facebook has now opted to expand the removal of political and civic groups from group recommendations worldwide.
As per Facebook:
"While people can still invite friends to these groups or search for them, we have now started to expand these restrictions globally. This builds on restrictions we’ve made to recommendations, like removing health groups from these surfaces, as well as groups that repeatedly share misinformation."
That's in line with Facebook's broader effort to reduce political content in News Feeds. Last month, Facebook CEO Mark Zuckerberg said that a common note of feedback the company is now seeing is that "people don't want politics and fighting to take over their experience on our services". The political division over the past four years appears to have taken a toll, and it seems that Facebook users at least have had enough, which has prompted Facebook to re-think how much politics is displayed in user feeds.
This move to reduce political groups is another step on this front, and could go some way towards reducing tension on the platform.
In addition, Facebook's also looking to reduce the reach of non-political groups that violate platform rules:
"When a group starts to violate our rules, we will now start showing them lower in recommendations, which means it’s less likely that people will discover them. This is similar to our approach in News Feed, where we show lower quality posts further down, so fewer people see them."
That could prompt more group admins to keep a closer eye on their groups, and add more impetus to address potential concerns. And when you also add in the fact that Facebook has recently started adding new tools to enable the monetization of groups, there's now a financial incentive for admins to better enforce group rules, and keep the discussions on track.
In addition, Facebook says that its new group restrictions will become increasingly severe as they accrue, until groups are removed completely for repeated issues.
"And when necessary in cases of severe harm, we will outright remove groups and people without these steps in between."
Again, Facebook's putting the onus on group admins to take more responsibility, with full bans now a possibility due to failure to act.
Facebook's also adding new warning prompts on groups that have previously been tagged with Community Standards violations, so that people can make a more informed decision before joining.
"We'll limit invite notifications for these groups, so people are less likely to join. For existing members, we’ll reduce the distribution of that group’s content so that it’s shown lower in News Feed. We think these measures as a whole, along with demoting groups in recommendations, will make it harder to discover and engage with groups that break our rules."
Facebook will also temporarily require admins and moderators to approve all posts when a group has a substantial number of members who have violated its policies, while users who register repeated violations in groups will be blocked from posting or commenting in any group for a period of time.
The penalties are significant, and set clear, new ground rules for group activity - which, given the stakes, is a positive step.
As engagement in Facebook News Feeds has declined, Facebook has instead shifted its focus onto groups, keeping its 2.7 billion users engaged by highlighting relevant topics and communities for people to join. That ensures Facebook can maintain high levels of engagement, while it also moves some of the more problematic discussion out of people's main feeds, which provides the added benefit of not turning off regular users.
Last year, amid the COVID-19 pandemic, groups became a key connective tool - with regular social interactions off the cards, more and more users gravitated towards groups of interest to stay in touch with the outside world. Over 1.8 billion people now use Facebook groups every month, but that's also led to some more problematic elements.
One of the main issues with Facebook groups has been the proliferation of hate speech, and dangerous movements like QAnon and The Proud Boys, which have organized rallies and events via their Facebook communities. Some participants in these groups have gone on to commit real-world harm, while the Capitol riots earlier this year were seen as the culmination of these dangerous communities banding together, at least partially via Facebook's platforms.
Seeing online chatter evolve into an actual insurrection attempt has prompted much discussion about how to address such concerns, with the overall consensus being that early action is needed to stem the rise of such communities, rather than dismissing them as harmless talk.
That's what's inspired these new rules from Facebook, and given what we now know about where such engagement can lead, these are important, valuable steps, that could have a big impact.
The new rules won't solve all the problems, and Facebook still has a way to go in ensuring group engagement doesn't lead to real-world incidents (worth noting, too, that similar concerns relate to Facebook's plan to make all of its messaging apps fully encrypted, essentially hiding such discussion from all enforcement). But these are some positive steps, and it's good to see Facebook taking stronger action against problematic interactions.