Bullying is one of the most critical issues in society, particularly among younger, more impressionable people, who are also the most prominent users of social media. That's why it's good to see Instagram implementing more measures to protect its users, adding new anti-bullying tools this week to advance its efforts.
Here's what's been announced.
First off, Instagram's adding a new automated warning which will prompt users when their comment may be considered offensive, giving them a chance to review before the post creator sees it.
As you can see in the example above, the new tool will be triggered by the use of certain words within comments, and will alert the user with a notification that asks if they're sure that's what they want to post.
As explained by Instagram:
"This intervention gives people a chance to reflect and undo their comment, and prevents the recipient from receiving the harmful comment notification. From early tests of this feature, we have found that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect."
That quick re-check may be all that's needed - given that post comments carry no vocal tone, it can be easy to unintentionally say the wrong thing. Taking a moment to consider how, exactly, the recipient might read a comment could reduce accidental harm.
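Instagram hasn't published how its detection works, but the trigger-and-prompt flow described above can be illustrated with a deliberately simple sketch - here, a flagged-word list stands in for Instagram's actual (far more sophisticated, likely machine-learning-based) classifier, and all names are assumptions:

```python
# Illustrative sketch only - Instagram's real system is not a word list.
# FLAGGED_WORDS and needs_review are hypothetical names.
FLAGGED_WORDS = {"ugly", "stupid", "loser"}  # stand-in for a trained classifier

def needs_review(comment: str) -> bool:
    """Return True if the comment should trigger the 'are you sure?' prompt."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    return not FLAGGED_WORDS.isdisjoint(words)

print(needs_review("You are so stupid!"))  # True - user is prompted first
print(needs_review("Great photo!"))        # False - comment posts normally
```

The key design point is that the check runs before the comment is published, so the recipient never receives a notification for a comment the author decides to withdraw.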
In addition to this, Instagram's also testing a new 'Restricted' mode, which will give users a way to lessen the impact of certain users without alerting them to any such limitation.
As explained by Instagram, the process will help users limit potential bullies without causing real-world angst:
"We’ve heard from young people in our community that they’re reluctant to block, unfollow, or report their bully because it could escalate the situation, especially if they interact with their bully in real life. We wanted to create a feature that allows people to control their Instagram experience, without notifying someone who may be targeting them. Soon, we will begin testing a new way to protect your account from unwanted interactions called Restrict. Once you Restrict someone, comments on your posts from that person will only be visible to that person. You can choose to make a restricted person’s comments visible to others by approving their comments."
The idea here is that by limiting the exposure of such comments, you lessen the harm - though it does still seem a little flawed in practice.
As you can see in the last frame of the example above, Restricted mode will mean that only you and the person you're restricting can see their comments. That means, for one, that you - the person being targeted - will still see those comments. Reducing their reach to others is still significant, but harm can still be done if the comments get through to the target themselves.
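The visibility rule Instagram describes can be expressed as a short check - a hypothetical sketch, assuming a simple set of restricted accounts and a per-comment approval flag (the function name, parameters, and data representation are all assumptions, not Instagram's API):

```python
# Hypothetical sketch of Restrict's comment-visibility rule as described
# in the article; names and structure are assumptions for illustration.
def can_see_comment(viewer: str, author: str, post_owner: str,
                    restricted: set, approved: bool) -> bool:
    if author not in restricted:
        return True      # unrestricted comments stay visible to everyone
    if viewer in (author, post_owner):
        return True      # the restricted commenter and post owner still see it
    return approved      # others see it only once the owner approves it

restricted = {"bully"}
print(can_see_comment("friend", "bully", "me", restricted, approved=False))  # False
print(can_see_comment("bully", "bully", "me", restricted, approved=False))   # True
```

Note that the restricted commenter always sees their own comment - which is the point (they don't know they've been restricted), but also the flaw the article identifies: the comment still reaches the target's feed.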
The other concern with this approach is that it may not actually help in the way Instagram would hope.
As noted by Instagram, some users are hesitant to block or unfollow someone they interact with in real life, and in such instances, the restricted user would likely also interact with other people the victim knows. So if that person were to comment, and their comment didn't show up for other users in the same social group, they'd likely work out that they'd been restricted. That could lead to the same real-world repercussions Instagram highlights in regard to its existing measures.
At the same time, it's important for Instagram to try new things, and its internal testing has likely highlighted potential benefits. The approach seems imperfect, but it may provide added protection and lessen such impacts.
And worth noting, restricted users also won’t be able to see when you’re active on Instagram, nor when you’ve read their direct messages.
The importance of such programs cannot be overstated - according to the American Foundation for Suicide Prevention, suicide is the 10th leading cause of death in the US, with almost 45,000 Americans dying by suicide each year. The reasons in each instance are complex, but various studies have shown links between social media use and depression, particularly among teens and young users. And certainly, online bullying - which can see victims subjected to attacks non-stop, both in and out of school and other social settings - is a key problem we need to address.
In addition to this, various studies have also linked Instagram usage specifically to mental health concerns.
According to a survey conducted in 2017, Instagram is "the worst social media network for mental health and wellbeing", with the platform contributing to higher levels of anxiety and depression, among other issues. And when you also consider that teens regularly delete Instagram posts which don't get enough likes, it's clear that such metrics are contributing to these concerns.
Given these factors, it's important that Instagram tries new things, and that it looks to address such issues where it can. We can't expect the platform to eliminate such problems entirely, but the fact that it's investigating new options is a positive.
The Restrict option may need more refinement before it works as intended, but it's another good starting point, and shows that Instagram is working to improve its processes.