Why Reporting Someone Who's Suicidal on Twitter Might Do More Harm Than Good


Social media platforms like Instagram, Facebook and Twitter offer tools that let users report content that breaks community guidelines, with the goal of keeping these spaces free of hate speech and threats.

But users can also turn these same features on someone they follow who expresses thoughts of suicide or self-harm. In a viral thread, one Twitter user shared why reporting someone who expresses suicidal thoughts online can hurt more than it helps.

Lizzy, whose Twitter handle is @passingtrain, shared what happened to her when her own account was reported.

“PSA: If a person on Twitter suggests that they are suicidal, DO NOT report the account. It does absolutely nothing to help, and in many cases, you are actively making the situation worse,” she wrote.

Twitter operates on a “three-strike” rule for reports of self-harm and suicide. After being reported for the third time, Lizzy was locked out of her account for a week.

Strike one results in an automated email from Twitter, strike two in a 24-hour lockout from your account, and strike three in a 7-day lockout. The email Twitter sends after a first strike nowhere mentions that further reports of suicide and/or self-harm will lead to a temporary lock being placed on the user’s account.

When an account is locked, you can’t send tweets, like or retweet, or follow other users, although you can look at your Twitter timeline and send direct messages to people who follow you.

“This effectively silences the person who may be at risk of harm,” Lizzy wrote, adding:

Locking somebody out of their account features because they are ‘encouraging or contemplating suicide or self-harm’ does nothing but take away a means of communication from somebody who may be in a vulnerable situation.

If this is a person who has tweeted about suicide before, there is a strong chance they have already been reported by another user and are therefore already on a strike. Reporting them again has the potential to isolate them even further.

Lizzy told The Mighty she wanted to share what happened to her because Twitter treats contemplating suicide and encouraging suicide as one issue. While trolls online are known for telling people to “go kill themselves,” people expressing general distress are typically in need of support, which some seek online.

“They’re clearly very different in how they should be approached,” she said. “These issues should be separated on the report form, and encouraging suicide/self-harm (such as telling other users to harm themselves) should warrant a warning and/or suspension.”

In her thread, Lizzy said Twitter’s policy also doesn’t account for people who aren’t suicidal but are going through a difficult time. Tweeting “kill me now” got her account locked for a week.

When asked for comment about Lizzy’s thread, a Twitter spokesperson told The Mighty, “We did temporarily limit the account but reversed the decision after reviewing the content in more detail,” adding that the platform uses a combination of human review and technology to assess content that gets reported.

On a page explaining its rules and policies about “glorifying self-harm and suicide,” Twitter says that while it wants people to feel safe sharing their thoughts and feelings, “we draw the line at encouraging or promoting self-harm and suicidal behavior, which can pose safety risks for others.”

While images of self-harm and descriptions of suicide methods can be triggering for those who are struggling, simply talking about suicide in general, or discussing your own suicidal thoughts, has not been proven to encourage suicidal behaviors in others.

As for what Twitter can do better, Lizzy suggests they treat encouraging suicide and expressing suicidal thoughts as two different issues. Instead of putting a user through the same reporting system made for threats and dangerous content, Twitter could send an automated message with resources directly to a user’s inbox.

“Nobody should be made to feel as though they are the problem when they are struggling the most,” she said. “Locking a user’s account cuts them off from those who may be able to support and also implies that the user should be punished for speaking openly about their mental state. It does nothing to help but could potentially cause immense harm.”


