Here's a Devil's Advocate-type question - could Facebook's big push to encourage more group usage have an underlying motive in shifting controversial discussion out of the public eye?
That question stems from a new report in The Guardian, which suggests that The Social Network is not doing enough to curb the growth of private anti-vaccination groups - a concerning movement which, in some regions, has seen long-controlled diseases start to recur.
As per The Guardian:
"So-called “anti-vaxxers” are operating on Facebook in closed groups, where members have to be approved in advance. By barring access to others, they are able to serve undiluted misinformation without challenge."
This is a side concern of the broader groups shift. According to Facebook, more than a billion people engage in Facebook groups each month, where they're able to interact with like-minded people in an enclosed space, as opposed to sharing their thoughts in public. That growth in groups usage makes sense in terms of broader shifts in digital interaction - the top four messaging apps now have more users than the top four social platforms, with users becoming more wary about exposing their posts to public scrutiny. But it also allows these more controversial, concerning movements to grow.
But could that actually be beneficial for Facebook itself?
Facebook's big groups push first started in early 2017, as the talk of Russian interference was gaining momentum, and fingers were being pointed at Facebook as a key concern - a key tool in the spread of misinformation and the fueling of societal division. Given this, it might make sense for Zuck and Co. to be keen to encourage groups as safe spaces, where people can share their opinions, no matter how controversial, with like-minded folk. Free of judgement, allowed to grow. Allowed to fester, out of the public eye.
The benefits here could be two-fold. For one, it takes some of the onus off Facebook to censor such content, because the only people seeing it are those who agree with it (and thus, they're less likely to report it). It would also remove at least some of this material from the public eye, lessening scrutiny on the platform more broadly. It's still there, those conversations are still happening, but because regular users aren't being exposed to them, the apparent problem lessens.
The logic of both elements makes sense, and adds another dimension to Facebook's broader groups push - but it also means, as noted by The Guardian, that these movements are still active, still growing and building momentum on the platform.
"The groups are large and sophisticated. Stop Mandatory Vaccination has more than 150,000 approved members. Vitamin C Against Vaccine Damage claims that large doses of the vitamin can “heal” people from vaccine damage, even though vaccines are safe."
Consider too, in this specific example, that the World Health Organization lists vaccine hesitancy - "the reluctance or refusal to vaccinate despite the availability of vaccines" - among its top 10 global health threats in 2019, while the WHO also notes that there's been a 30% worldwide increase in measles cases, "with some countries that were close to eliminating the disease now seeing a resurgence".
Facebook, now at 2.3 billion monthly active users, is arguably the most influential platform in history, with huge capacity to shift people's thoughts and actions through what they see, and are shown, on the network. With that influence comes the responsibility to manage how it's used - and while Facebook has long sought to distance itself from editorial oversight, or overt censorship, there is a question over the role it plays in enabling movements like this to grow.
Should Facebook be doing more to take action against such groups? Should Facebook be held responsible for the actions of adults who use its platform, who have the freedom to think and discuss what they like?
It's a difficult balance, and one that Facebook will always struggle to maintain, but it does add another, less wholesome element to The Social Network's wider groups push.