The 'Logan Paul Suicide Forest' Game Highlights a Bigger Problem in the Video Game Industry
Sometimes the news isn’t as straightforward as it’s made to seem. Matt Sloan, The Mighty’s contributing editor, explains what to keep in mind if you see this topic or similar stories in your newsfeed. This is The Mighty Takeaway.
Editor’s note: If you experience suicidal thoughts or have lost someone to suicide, the following post could be potentially triggering. You can contact the Crisis Text Line by texting “START” to 741741.
Following the controversy surrounding Logan Paul’s “Suicide Forest” video, Felipe Orion, an independent games developer from Brazil, has received backlash for creating a suicide-themed video game based on Paul’s controversial video, which showed the body of a man who had died by suicide in Japan.
Suicide can have a place in video games — similar to the way it can be covered in other forms of media — but it needs to be executed sensitively. We don’t need to eradicate suicide from video games, but we do need to treat the topic with respect and care. Disrespectful representations that make light of suicide, like Orion’s game, shouldn’t exist — just like Logan Paul’s original video should never have been created.
Appearing on the Xbox Marketplace as part of Microsoft’s self-publishing Creators Program, “Logan Paul in the Suicide Forest” has players enter Aokigahara Forest as Logan Paul, hunting for bodies “in order to get millions of views.” Once the player finds a body, Logan Paul’s avatar is shown dancing while the YouTube views pile in.
The game, which Orion told Gizmodo was created in four days as a “social experiment,” has drawn a fair amount of criticism from players and people in the games industry. Bear Parker, global community manager for Lego game developer TT Games, shared a concerned tweet about it, to which Microsoft’s Larry Hryb replied, “This is being taken care of.”
Is this a joke? Just decided to check out the Creators Collection on the Xbox Marketplace and found a free, PEGI 3 rated game based on Logan Paul’s trip to Aokighara Forest… pic.twitter.com/i5XIW7OLvn
— (@Bearskopff) January 30, 2018
When Microsoft first announced its self-publishing platform in 2017, the company confirmed it wouldn’t hold developers to concept approvals, but reserved the right to remove “harmful or inappropriate content” from its store.
As the producer of the second best-selling console of the current hardware generation, Microsoft owes it to its fans to do better. Microsoft’s store policies state that content should not “glamorize extreme or gratuitous violence, result in discomfort or any other harm to end users” or “contain potentially sensitive or offensive content.” These guidelines explain its decision to remove “Logan Paul in the Suicide Forest” from its marketplace, but why was the game allowed in the first place? Rather than reactively removing inappropriate content, Microsoft should be proactive and ensure games like this never see the light of day.
By prioritizing ease of use for would-be developers, Microsoft has sacrificed the safety of an untold number of fans who could have been negatively affected by this game. It’s also appeasing a negative side of gaming culture that openly mocks serious issues such as trauma and suicide. As a society, we deserve better than this.
When The Mighty reached out to Microsoft for comment, a spokesperson said:
This content violated our Store policies and has been removed. Users can report inappropriate content on the game’s product page or by sending a report directly to firstname.lastname@example.org. We have reviewed this incident and implemented changes to our processes.
When asked, in light of this game and others, what changes would be implemented, Microsoft declined to provide any further comment.
This isn’t the first time such a game has appeared in an online marketplace. Last week, “Logan Paul: Suicide Forest Run” was briefly available to Android users before Google deleted it. The developer, Simo Mediator, has since appealed to Google, asking for the game to be reinstated. However, Google Play’s Developer Policy Center clearly states, “We don’t allow apps that lack reasonable sensitivity towards or capitalize on a natural disaster, atrocity, conflict, death or other tragic event.”
Another poorly executed game, “Suicide Simulator,” is still available for purchase in Valve’s popular Steam store. According to its description, the game “does NOT encourage, support or endorse real-life suicide in any shape or form.” Yet despite that disclaimer, the game also advertises, “have you ever wanted to kill yourself? Now you can!” Both the game and its reviews openly mock suicide.
Whether games like “Logan Paul in the Suicide Forest” are approved due to human error or poor enforcement of publishers’ own policies, games that depict or make light of suicide can have dangerous implications. Because of the visual, and sometimes visceral, nature of video games, vivid depictions of suicide can be incredibly triggering for people vulnerable to suicidal ideation.
More than 50 research studies worldwide have found that certain types of news coverage can increase the likelihood of suicide. Because of this, guidelines for reporting on suicide advise against including suicide imagery or means. While Orion’s game does not constitute reporting per se, the game clearly shows the body of a person who died by suicide in the same manner and location as the man in Logan Paul’s original video, with no trigger warning.
Chris Maxwell, communications coordinator for the American Association of Suicidology, said when it comes to representing suicide in games, intention is important. “If mental health or suicide are potential topics in the game, then it shouldn’t be difficult to include resources or links to supports within games.”
Beau Pinkham, director of crisis intervention services at The Crisis Center of Johnson County, echoed this. Pinkham — a firm believer that all video games, in some way, are art — said the suicide prevention field isn’t calling for censorship or telling artists how they should express themselves.
However, what we’re not seeing is creators in the video game industry reach out to people in our field — to inform themselves on what may or may not be a dangerous portrayal of suicide, to request specific resources that they could provide at the end of the game, or to even just have a better idea of what to say to concerned people who reach out to the developer themselves.
Given the rapidly expanding ability for smaller developers to get a video game on platforms like Steam and the Xbox Marketplace, Pinkham added that those in the suicide prevention field need to at least have conversations with developers. “If the safe messaging can’t happen in-game, it can happen in other ways.”
Despite creating a Logan Paul-themed game, Orion told The Mighty he’s no stranger to the world of mental illness, sharing that he too has had suicidal thoughts. “Unfortunately the world is not a safe zone,” he said. “The world is a hard place where you face unpleasant things daily. If [sic] wasn’t my game, eventually [someone affected by this game] would be exposed to other forms of media that contains … trauma.”
While this is a common argument among those who create intentionally shocking and insensitive media, his excuse is akin to standing back in the midst of a crisis and waiting for somebody else to act. If we want to eradicate insensitivity, we must educate people rather than point a finger and say, “Well, he did it. Why can’t I?”
“The game was created purely to experiment how the gaming community and the market would react to a game like that,” Orion said, adding:
When I created this game I didn’t had [sic] the suicide topic in mind. If I had to make a game about the suicide itself, the game would be totally different. The obstacles would be based in typical problems that real people with depression faces and if the player managed to get to the end they would see that it’s totally possible to get rid of depression in real life because they already would’ve done it in the game.
That’s it bros!
The Suicide Forest is officially down in all platforms it was released! hahah
I’m really, really glad with the results I got with this game. I received to much support from the gamers that made me see I’m in the right direction. Many new games will come! pic.twitter.com/1i2N0uWCcJ
— Felipe Orion (@FelipeOrion_) February 1, 2018
This isn’t to say we can’t have video games that broach the topic of suicide. The widely praised visual novel “Actual Sunlight” depicts a character living with depression and thoughts of suicide. In “Life is Strange,” player character Max can save her classmate Kate from suicide, but only if the player invests time getting to know Kate and can therefore say the things she needs to hear to help her through a difficult time.
In the universally acclaimed “The Last of Us,” a character dies by suicide at the end of the game’s second act. It should be noted that “The Last of Us” is a post-apocalyptic, story-driven game concerned with loss and survival. The character in question had recently lost his brother, the person he was supposed to protect, to the game’s catastrophic pathogen.
These examples are, for the most part, from larger developers who must get their games pre-approved, which shows that companies can portray suicide sensitively when held to a higher standard. More and more tech companies are upgrading their platforms to detect and remove harmful content sooner. If Facebook can upgrade its AI to detect suicidal ideation, why can’t these publishers implement their own protocols to detect potentially sensitive games? Technology like this would let budding game developers express themselves freely without content approval, while letting publishers enforce the policies they already have in place. If the games industry is to grow, it must also take greater care to moderate potentially triggering content.
If you or someone you know needs help, visit our suicide prevention resources page.
If you need support right now, call the National Suicide Prevention Lifeline at 1-800-273-8255, the Trevor Project at 1-866-488-7386 or reach the Crisis Text Line by texting “START” to 741741.
Image via YouTube