Facebook Outlines Enhanced Efforts to Remove Child Exploitation Content from its Platform
As part of its ongoing efforts to improve its user security and protection measures, Facebook has this week outlined the advances it's made in detecting and removing content which exploits children, an area that requires specific and dedicated focus.
Using improved artificial intelligence tools, Facebook has been able to significantly improve its performance in this regard - as explained in the update post:
"In the last quarter alone, we removed 8.7 million pieces of content on Facebook that violated our child nudity or sexual exploitation of children policies, 99% of which was removed before anyone reported it."
That's a huge endorsement of Facebook's ongoing AI and image recognition capacity - the fact that Facebook has been able to shield users from nearly all of this content before it's even reported is a major achievement, one which spares both users and Facebook content reviewers the trauma of such exposure.
And the latter is a real issue - a recent documentary outlined how Facebook content moderators working in Manila suffer significant psychological impacts from exposure to such content, which they review in order to shield everyday users from having to face the same. The more Facebook can do to address this - for employees and users alike - the better, while the figures also underline how far Facebook's image recognition and detection tools have now evolved.
Facebook has also outlined its ongoing relationship with the National Center for Missing and Exploited Children (NCMEC) to police such behavior when detected.
"We have specially trained teams with backgrounds in law enforcement, online safety, analytics, and forensic investigations, which review content and report findings to NCMEC. In turn, NCMEC works with law enforcement agencies around the world to help victims, and we’re helping the organization develop new software to help prioritize the reports it shares with law enforcement in order to address the most serious cases first."
Facebook also notes that it works with a range of safety experts, NGOs and companies "to disrupt and prevent the sexual exploitation of children across online technologies".
This is a critical element for all digital platforms to address. The sharing of such content puts the most vulnerable members of our society at risk, and it's important that this is recognized, and that every effort is made to stop it.
Facebook's advances here are important, and should help lift detection standards across the broader industry. While the company has come under fire for privacy breaches and errors in recent months, we should also recognize initiatives like this, and acknowledge these significant developments.
Follow Andrew Hutchinson on Twitter