After announcing a ban on graphic images of self-harm back in February, Instagram is now expanding its parameters to take in more types of self-harm related content, as it seeks to better protect vulnerable and at-risk users.
The new changes have been outlined in a blog post from Instagram chief Adam Mosseri, who has also explained the dangers of such content:
"Nothing is more important to me than the safety of the people who use Instagram, particularly the most vulnerable. Suicide and self-harm are difficult and complex topics that people understandably care deeply about. These issues are complicated – there are many opinions about how best to approach them — but they matter a lot, and to me, as a parent, they certainly hit home."
Instagram's initial moves on this front back in February led to the platform developing new systems that are better able to detect images and content depicting self-harm, or scars in the aftermath of it. Mosseri says these systems have enabled Instagram to significantly expand its removal efforts:
"As a result, we have been able to act on twice as much content as before. In the three months following our policy change we have removed, reduced the visibility of, or added sensitivity screens to more than 834,000 pieces of content. We were able to find more than 77% of this content before it was reported to us."
That same technology has now advanced to the point where Instagram can apply it to more self-harm related content, leading to this new expansion:
"This past month, we further expanded our policies to prohibit more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery. We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods."
This is a positive step for Instagram. The platform has become a key communications tool for younger users in particular, and those interactions can have a huge impact on mental wellbeing.
Indeed, Instagram, according to some reports, is the worst social network in this respect. A study conducted by The Royal Society for Public Health in the UK in 2017 showed that Instagram usage had the biggest potential impact in relation to higher levels of anxiety, depression, bullying and “fear of missing out.”
The focus on self-harm related imagery seems like a logical and entirely beneficial step, with the restrictions carrying limited, if any, potential negative consequences.
In addition to these content removals, Mosseri has also noted that accounts found to be sharing this type of material will not be recommended in search or discovery surfaces, like Explore, adding to the potential impacts on those who seek to push the boundaries.
As noted, the research is fairly clear on the potential negatives of social media usage with regard to mental health, with Instagram, and its focus on visuals, being a key concern in this respect. The platform has acknowledged this, and is taking steps to address it.
This is not a problem, as Mosseri notes, that can ever truly be 'solved', but the more that can be done to limit these impacts, the better.
You can read more about Instagram's latest bans on self-harm related content here.