Instagram Will Now Remove Any Self-Harm Related Images as Part of New Safety Measures
As part of its ongoing efforts to protect vulnerable users, Instagram has announced a new set of rules relating to images of self-harm, including images of cutting and of healed scars.
Instagram chief Adam Mosseri has announced four new measures, which expand on the platform's existing rules around similar content.
- We will not allow any graphic images of self-harm, such as cutting, on Instagram – even if it would previously have been allowed as an admission. We have never allowed posts that promote or encourage suicide or self-harm, and will continue to remove them when reported.
- We will not show non-graphic, self-harm related content – such as healed scars – in search, hashtags and the explore tab, and we won’t be recommending it. We are not removing this type of content from Instagram entirely, as we don’t want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help.
- We want to support people in their time of need – so we are also focused on getting more resources to people posting and searching for self-harm related content and directing them to organizations that can help.
- We’re continuing to consult with experts to find out what more we can do. This may include blurring any non-graphic self-harm related content with a sensitivity screen, so that images are not immediately visible.
The rules may have some broader impacts on artistic works, but that cost is minor when weighed against the potential harm such content poses for those at risk, and it makes sense for Instagram to take a stronger stance on this front.
The decision to restrict the exposure of healed scars may also have some broader effects, particularly for people who incidentally show such scars in their images, but again, that impact is likely outweighed by the potential trauma this content could cause, and limiting its reach (note: such images will not be banned) makes sense.
It's also difficult to know what the actual impact of these changes will be until they're enforced. Instagram's machine learning detection systems are likely not advanced enough to reliably pick out scars in images - attempting to do so would generate a lot of false positives - so you'd expect some level of reliance on user reports, and how Instagram rules on those reports is impossible to know until we see the process in practice.
For its part, Instagram acknowledges the complexity of this issue, and has pledged to continue working towards the best solution:
"Up until now, we’ve focused most of our approach on trying to help the individual who is sharing their experiences around self-harm. We have allowed content that shows contemplation or admission of self-harm because experts have told us it can help people get the support they need. But we need to do more to consider the effect of these images on other people who might see them."
Given the potential for social networks to influence mental state, particularly as it relates to depression, it's important that all platforms continue to investigate what they can do, and seek out better solutions to protect vulnerable users.
Indeed, according to some reports, Instagram is the worst social network in this respect - a 2017 study by The Royal Society for Public Health in the UK found that Instagram was the platform most associated with higher levels of anxiety, depression, bullying and “fear of missing out.”
It may take some time to get right, and there could be unintended impacts along the way, but Instagram's latest moves on this front are worthy of encouragement.