- Instagram is banning all graphic images of self-harm.
- The move follows the social network being blamed for the suicide of British teenager Molly Russell.
- Instagram boss Adam Mosseri said "we are not where we need to be on self-harm and suicide."
- Banning graphic content is one of four steps he laid out to make the platform safer.
Instagram’s new boss has promised to ban all graphic images of self-harm after the social network was blamed for the suicide of a British teenager.
Adam Mosseri has been in London this week, where reports about the death of Molly Russell have dominated the headlines after her father, Ian, said Instagram "helped kill my daughter."
Molly Russell was 14 when she took her own life in 2017. Following her death, her family found she’d been following numerous Instagram accounts featuring images of self-harm and suicide. The story was uncovered by the BBC.
In a blog post, Mosseri said "we are not where we need to be on self-harm and suicide" and made four pledges to help make Instagram a safer place for users. They are:
- Ban all graphic images of self-harm, such as cutting.
- Non-graphic images, such as healed scars, will be removed from search, hashtags, and the explore tab. They won’t be removed altogether because Instagram does not want to "stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help."
- More resources will be dedicated to getting help for people posting and searching for self-harm related content.
- Instagram will consult experts on further steps, such as potentially blurring any non-graphic self-harm related content with a sensitivity screen.
"During the comprehensive reviews, the experts, including the Centre for Mental Health and Save.org reaffirmed that creating safe spaces for young people to talk about their experiences — including self-harm — online, is essential. They advised that sharing this type of content often helps people connect with support and resources that can save lives," Mosseri said.
"However, collectively it was advised that graphic images of self-harm — even when it is someone admitting their struggles — has the potential to unintentionally promote self-harm. Which is why we are no longer allowing graphic images of self-harm."
During his time in London, Mosseri met key members of the British government: Culture Secretary Jeremy Wright and Matt Hancock, the health secretary. Both have promised to get tougher on social media giants, with Hancock suggesting that companies could be banned for not removing harmful content. The UK government will set out its position on internet regulation in an internet safety white paper next month.
For UK-based readers: If you or someone you know is struggling with depression, or has had thoughts of self-harm or taking their own life, you can contact The Samaritans 24/7 on 116 123.
For US-based readers: If you or someone you know is struggling with depression or has had thoughts of self-harm or taking their own life, get help. The National Suicide Prevention Lifeline (1-800-273-8255) provides 24/7, free, confidential support for people in distress, as well as best practices for professionals and resources to aid in prevention and crisis situations.
Source: Business Insider