Facebook has outlined its evolving efforts to detect and remove 'non-consensual intimate images', or revenge porn, across its apps, using advanced machine learning to identify such content faster and better protect victims.
As per Facebook:
"By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram. This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves, or they are unaware the content has been shared."
Expanding on this, Facebook is now broadening the pilot program it launched in 2017, which enables users to submit a photo to Facebook, from which Facebook's team can create a "digital fingerprint" of the image and block it from ever being shared.
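Facebook hasn't detailed the algorithm behind these "digital fingerprints" (industry systems like Microsoft's PhotoDNA work along similar lines), but the general idea of perceptual hashing can be sketched in a few lines of Python. Everything below, from the simple average-hash approach to the toy 8x8 "images", is illustrative only, not Facebook's actual implementation:

```python
# Illustrative sketch only: Facebook has not published its matching method.
# This shows the general idea behind perceptual fingerprinting using a
# simple "average hash" (aHash): shrink an image to a tiny grayscale grid,
# then record which cells are brighter than the mean. Copies and
# near-copies of an image yield identical or nearly identical hashes,
# which can be compared without storing the image itself.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return bin(h1 ^ h2).count("1")

# Toy data standing in for real images: a gradient, a slightly brightened
# copy of it (a "re-shared" version), and an unrelated (inverted) image.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
reshared = [[min(255, v + 3) for v in row] for row in original]
different = [[255 - v for v in row] for row in original]

print(hamming_distance(average_hash(original), average_hash(reshared)))   # near 0
print(hamming_distance(average_hash(original), average_hash(different)))  # large
```

Because only the hash is stored and compared, a system like this can flag a re-upload of a known image, even one that's been lightly edited, without keeping the image itself on file.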
The program was met with some controversy when it was first tested with Australian users, amid concerns that Facebook was asking people to upload their naked images in order to stop the spread of those same images.
Facebook says it has largely received positive feedback from victims and support organizations regarding the option, and it will expand the pilot over the coming months "so that more people can benefit from this option in an emergency".
Facebook's also launching a new section of its Safety Center dedicated solely to providing support for revenge porn victims.
"Here victims can find organizations and resources to support them, including steps they can take to remove the content from our platform and prevent it from being shared further - and they can access our pilot program. We’re also going to make it easier and more intuitive for victims to report when their intimate images were shared on Facebook. And over the coming months, we’ll build a victim support toolkit to give people around the world more information with locally and culturally relevant support."
Revenge porn has become a significant issue in recent times - according to research, around one in 25 Americans has either been threatened with or been a victim of nonconsensual image sharing, equating to around 10 million Americans. And the impacts can be significant: a study by the Cyber Civil Rights Initiative found that 51% of US revenge porn victims have contemplated suicide as a result of the incident.
Given this, it's important for social platforms to provide ways to help and support victims. Facebook's image-identification tools are a big step forward in this regard, and will hopefully lessen the impact of such incidents, and provide some level of assurance to those in need.