Three weeks ago, a group representing hundreds of Reddit moderators, who lead some of the platform's most popular communities, published an open letter calling for Reddit to take more action against racism and hate speech.
As per the letter:
"The problem of Reddit’s leadership supporting and providing a platform for racist users and hateful communities has long been an issue. Nearly six years ago, dozens of subreddits signed the original open letter to the Reddit admins calling for action. While the Reddit admins acknowledged the letter and said it was a high priority to address this issue, extremely little has been done in the intervening years."
Reddit CEO Steve Huffman responded, saying that the platform would answer with actions, rather than words. And today, Reddit has outlined a range of updates to its content policy, which will more explicitly address hate speech and offensive behavior - and will see more than 2,000 subreddits immediately banned as a result.
As per Reddit:
"Ultimately, it’s our responsibility to support our communities by taking stronger action against those who try to weaponize parts of Reddit against other people. [...] It starts with dealing squarely with the hate we can mitigate today through our policies and enforcement."
The main focus of Reddit's new policy approach is 'remembering the human' - considering the impact of what you say on the people behind the screen.
"Remember the human. Reddit is a place for creating community and belonging, not for attacking marginalized or vulnerable groups of people. Everyone has a right to use Reddit free of harassment, bullying, and threats of violence. Communities and people that incite violence or that promote hate based on identity or vulnerability will be banned."
Reddit says that most of the subreddits that will be banned as a result are already inactive, but one in particular, r/The_Donald, is a popular and highly active network for supporters of US President Donald Trump.
The subreddit has been problematic for some time, with Reddit noting that it has:
"...consistently hosted and upvoted more rule-breaking content than average (Rule 1), antagonized us and other communities (Rules 2 and 8), and its mods have refused to meet our most basic expectations. Until now, we’ve worked in good faith to help them preserve the community as a space for its users—through warnings, mod changes, quarantining, and more."
The removal of r/The_Donald will no doubt add more fuel to the anger of conservatives, who feel that social platforms are increasingly trying to silence them through such policy updates.
Indeed, at the same time, Twitch has also banned President Trump from its platform over “hateful conduct” that was aired on a Twitch stream, Twitter continues to add warning pop-ups to the President's tweets, and Facebook is facing a growing advertiser boycott over its inaction on the same front.
Reddit and Twitch are likely seen as lesser platforms for distribution, and thus, their moves on this front won't be as heavily scrutinized. But Reddit, in particular, is more influential than most people would think, with many memes and trends originating from the increasingly popular sharing network.
As such, the removal of r/The_Donald could have a big impact. It won't change the Trump campaign's overall approach to social media, with the majority of its focus on Facebook. But it could slow the momentum of related movements, and lessen the distribution of divisive content.
Of course, many of these users will simply switch to another, more tolerant platform, where they'll be free to share their unfiltered views. But that freedom could actually limit on-sharing, as the more offensive posts originating there will simply be removed when cross-posted to the mainstream networks. In this respect, the message distribution dynamic could be disrupted in a way that has broader implications.
Ultimately, with 430 million monthly active users, Reddit likely provided a bigger platform for such commentary than many might think, and as such, the flow-on effect could be more impactful in the broader scheme.
The changes also align with Reddit's more advertiser-friendly approach - over the last few years, Reddit has been looking to expand its ad tools and maximize its revenue potential, which has involved shifting its free speech ethos in order to counter negative brand perceptions of the platform.
These new rules will add to that approach, and could help to make Reddit a more mainstream, and enticing, ad option.
Overall, the policy decisions make sense. And while they do mean that Reddit will need to step further away from its 'free for all' approach of the past, they should provide more protection for vulnerable users, and create a healthier ecosystem for Reddit overall.
But it does, once again, underline the challenges in allowing free speech.
The great promise of all social media platforms is that they would be open spaces, where the people would decide what goes; where anyone can have a platform to say whatever's on their mind, at any time. But as we've found, there need to be limits on that openness.
Many critics of rule changes like this incorrectly invoke the First Amendment, which only restricts the Government from limiting free speech. Such changes are not defined by Government; they're defined by the platforms themselves, and as such, the platforms can decide what's allowed on their networks and what is not.
At some point, at a certain size or level of usage, the platforms need to come to terms with these impacts, and define what they are, and are not, comfortable being responsible for on their networks. If they move to limit certain speech, that's completely within their rights - and most would agree that a level of restriction on what can be shared and distributed is needed.
Where you personally draw that line is up to you, but the platforms are the ones that need to make the call on their own thresholds for such.