Changes to Instagram’s self-harm content rules follow a comprehensive review involving youth, mental-health, and suicide specialists and academics from around the world.
The move came after British Health Secretary Matt Hancock met with social media companies to press them to do more to safeguard the mental health of teenagers using their platforms.
British teenager Molly Russell was found dead in her bedroom in 2017. The 14-year-old had apparently taken her own life, and her Instagram account reportedly revealed she followed accounts associated with depression and suicide.
“It is encouraging to see that decisive steps are now being taken to try to protect children from disturbing content on Instagram,” said the girl’s father, Mr. Ian Russell.
“Now it is time for other social media platforms to take action and recognize the responsibility they too have to their users if the internet is to become a safe place for young and vulnerable individuals.”
The review was announced by Instagram chief executive Adam Mosseri.
“Over the past month, we have seen that we are not where we need to be on self-harm and suicide, and that we need to do more to keep the most vulnerable people who use Instagram safe,” Mr. Mosseri said in a web post.
“We won’t allow any graphic pictures of self-harm, such as cutting on Instagram – even if it would previously have been allowed as admission.”
Instagram has never allowed posts that promote or encourage suicide or self-harm.
The Facebook-owned service is also keeping non-graphic content associated with people hurting themselves, such as images of healed scars, out of its search, hashtag, explore, and recommendation features.
“We are not removing this type of content from Instagram entirely, as we don’t want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help,” Mr. Mosseri said.
Instagram also planned to ramp up efforts to get counselling and other resources to those who post or search for self-harm-related content.
“During the comprehensive reviews, the specialists, as well as the Centre for Mental Health and Save.org, reaffirmed that creating safe spaces for young people to talk about their experiences – including self-harm – online is crucial,” Mr. Mosseri said.
“However, it was collectively advised that graphic images of self-harm – even when someone is admitting their struggles – have the potential to unintentionally promote self-harm,” he continued, citing this as the reason for the ban.
Instagram’s aim is to eliminate graphic self-injury and suicide-related imagery and to significantly reduce the visibility of related content in features across the service, while remaining a supportive community, according to Mr. Mosseri.
On Thursday, Mr. Mosseri joined representatives from Facebook, Google, Snapchat, Twitter, and other companies who met with Mr. Hancock to discuss the handling of content associated with self-injury and suicide.
“What really matters is that when youngsters are on these sites, they’re safe. The progress we made today is good, but there’s plenty more work to do,” Mr. Hancock said after the meeting.
“What all the companies that I met today committed to was that they want to solve this problem, and they want to work with us on it.”