At Facebook's size, content moderation is always going to pose a challenge. While the company continues to improve its machine learning and artificial intelligence tools to help filter out offensive material before anyone has to actually see it, there are many cases where this is simply not possible, and human review is required.
The problem with that is that continued exposure to such material can have significant mental health impacts, which The Verge's Casey Newton outlined in his harrowing piece looking at what Facebook's moderators experience.
From Newton's report:
"Collectively, the employees described a workplace that is perpetually teetering on the brink of chaos. It is an environment where workers cope by telling dark jokes about committing suicide, then smoke weed during breaks to numb their emotions. It’s a place where employees can be fired for making just a few errors a week — and where those who remain live in fear of the former colleagues who return seeking vengeance."
The toughest part here is that somebody has to do it. To keep Facebook as civil and safe as possible for the platform's billions of users, a level of human moderation is required - Facebook needs this process in order to avoid much broader exposure and concern.
But, as Newton notes in his piece, Facebook could improve the conditions for workers who are left to deal with such horrors, and their ongoing impacts after work hours.
Facebook has taken note - this week, The Social Network announced a raft of changes to better care for its moderation staff, including pay increases, additional training and support, and new rules limiting overtime, among other measures.
As explained by Facebook:
"Content review at our size can be challenging and we know we have more work to do. We’re committed to supporting our content reviewers in a way that puts their well-being first and we will continue to share steps forward on this important topic."
It's a difficult area for Facebook, and indeed for all social media platforms. The ideal solution, which Facebook is working on, would be more advanced AI and machine learning tools to detect such content, but such systems are unlikely to ever filter out everything. Given that, Facebook needs to do all it can to care for the workers involved, and based on the descriptions of the current state of such workforces, there's still a way to go on this front.
And while it is good to see Facebook improving its conditions for moderators, there is one concerning note within Facebook's announcement.
"Today we’re committing to pay everyone who does contract work at Facebook in the US a wage that’s more reflective of local costs of living."
The concern here is that Facebook has moderators working all over the world, with contract staff filtering out offensive content in different regions, who, based on this announcement, won't see any pay increase. They will be eligible for some of the other newly announced improvements, including limits on overtime, but the majority of the new measures relate specifically to US employees. Hopefully, those same initiatives are also being driven for contractors in other regions, who are obviously subject to the same concerning conditions.
As noted, it's a very difficult area, and it's good to see Facebook taking more action on this front. And while you may not realize it - you may never have any direct exposure to what these moderation teams do - you most definitely benefit from their work. From a general user perspective, the thing to keep in mind is that when you report a post you don't like and don't get a timely response from Facebook, it may be because the moderation teams are working on the most extreme cases - and as such, your report may not get priority.
That's a problem for Facebook to resolve, but with 2.38 billion users, it's reasonable to expect that a level of prioritization is required - if you simply don't like the look of someone's post, but it's not overtly offensive, it may not get actioned right away.