Facebook is once again facing a wave of condemnation after the publication of a new series of internal reports – called ‘The Facebook Papers’ – which were recently leaked to media outlets.
An expansion of the initial ‘Facebook Files’, which former Facebook product manager Frances Haugen released to The Wall Street Journal, The Facebook Papers provide deeper internal insight, spanning research reports, employee commentary, experiment results, and more.
Among the various claims stemming from the new disclosure:
- Facebook executives have regularly prioritized growth over safety, despite evidence of significant risks
- Facebook struggled to contain dangerous claims that led to the Capitol Riot on January 6th, with various employees suggesting that the company should have been better prepared, while management often delayed or avoided decisions
- Facebook lacks local language support in many regions, which has allowed inflammatory language to flourish on the platform
- Facebook has been unable to stop the use of its platforms for human trafficking, an issue that almost saw its apps removed from the App Store in 2019
- Various Facebook employees voiced concerns that the platform was doing harm, and that the company had done little to improve, despite clear evidence of such.
Given Facebook’s scale and influence, the impact of each of these elements could be huge, in a range of ways, from fueling dangerous movements to causing political unrest.
Publicly, Facebook has repeatedly played down such concerns. For example, earlier this month, in response to the initial Facebook Files leak, the company’s vice president of policy and global affairs, Nick Clegg, said that the suggestion that Facebook had played a key role in contributing to post-election protests at the Capitol building was “ludicrous”.
Clegg’s view is that Facebook is only one small part of broader societal shifts, and that it can’t be the core problem that’s led to such major conflict in various regions. But these new documents show that, at the least, Facebook is well aware that it could be contributing to these problems, and that it’s either struggled to find a constructive way forward in addressing each, or, more concerningly, has been unwilling to take action for fear that such changes may stifle user growth and ultimately hurt the company’s bottom line.
That bottom line is also a key note of concern highlighted in the documents. Another internal disclosure shows that Facebook has been gradually losing touch with younger users, with time spent on Facebook by US teens down 16% year-over-year since 2019.
The data also shows that the number of new teen sign-ups on Facebook is declining – not a major revelation in itself, as most people probably have a sense that Facebook has lost some of its allure among younger user groups. But considered alongside the other disclosures, it paints a concerning picture of how growth concerns could shape the company’s decisions on whether to take action, or avoid it.
There are also insights into Facebook’s more extreme experiments to address these issues, with one report explaining how Facebook ran a test that hid the Like button on posts in order to reduce user stress and anxiety. That didn’t work, and it also reduced user sharing – but the experiment itself points to social media mechanics like Likes being another potential cause for concern, to the point that Facebook attempted to remove the functionality altogether.
The latest revelations are almost certain to see Facebook grilled in Congress once again, while they’ll also spark a new round of calls for external regulation, which could lead to major shifts in the social media landscape.
Should Facebook be regulated by an official body? Facebook itself supports some forms of regulation, but it’ll be interesting to see whether this new disclosure leads to alternative regulatory approaches that go beyond what Facebook would prefer.
The new reports will also put more pressure on Facebook CEO Mark Zuckerberg, who looms over each of the key decisions, and maintains clear control over the direction of the company. Given the platform’s scale and impact, should Zuckerberg remain in control of such a powerful force?
These are the conversations that we’re now set to see, as people seek solutions to the various problems outlined in the latest leak, which could, again, change social media entirely, or cause a significant shift in the power balance at Facebook.
But the real bottom-line question is: should a privately owned company have that much power and influence over societal shifts?
The insights show that Facebook does indeed have major influence, but could it do more, and what factors play into its decision making in this respect?