Instagram Will Blur Images of Self-Harm Following Suicide of 14-Year-Old Girl
On Monday, Instagram announced it would be introducing “sensitivity screens” for images of self-harm on its platform. Instagram will blur these images until a user explicitly taps on one, indicating they want to view the graphic content.
Currently, Instagram removes comments and disables accounts that encourage others to “embrace self-injury.”
The update follows the 2017 suicide of 14-year-old Molly Russell. Her father, Ian Russell, claimed Instagram was partly to blame because his daughter viewed graphic images of self-harm and suicide on the app.
“We are not yet where we need to be on the issues of suicide and self-harm,” Adam Mosseri, Instagram’s head of product, wrote in an op-ed for The Telegraph. “We need to do everything we can to keep the most vulnerable people who use our platform safe.”
Some have pointed out that, because of the way Instagram’s recommendation algorithm works, users who view or engage with images of self-harm may be shown more content like it.
Mosseri said Instagram engineers and content reviewers will be “working around the clock” to protect users on this front. He added:
We have engineers and trained content reviewers working around the clock to make it harder for people to find self-harm images. We have put in place measures to stop recommending related images, hashtags, accounts, and typeahead suggestions.
If you struggle with self-harm and you need support right now, call the crisis hotline at 1-800-273-8255 or text “START” to 741741. You can also post a Thought or Question on The Mighty to receive support from people who have been there. For a list of ways to cope with self-harm urges, visit this resource.
Do you think this update will be helpful? Let us know in the comments.
Header Image via Instagram Facebook page