
TikTok Reportedly Discriminated Against People With Disabilities to Prevent Bullying


A new report uncovered how the popular social media platform TikTok systematically discriminated against content made by people with disabilities in an attempt to prevent bullying on its platform. This censoring of people with disabilities and mental health conditions, however, is not unique to TikTok; it occurs on other major platforms as well, including YouTube, Pinterest and Instagram.

On Monday, Netzpolitik published its findings on TikTok’s moderation guidelines. TikTok’s moderation team allegedly marked the accounts of users with disabilities, along with those of people with bigger body sizes and members of the LGBTQ community, as vulnerable. The move limited their audience reach by millions of potential viewers and, in some cases according to Netzpolitik, rendered videos as good as invisible. TikTok’s moderation guidelines specifically called out autism, Down syndrome and facial differences, with decisions based on a user’s hashtags or a moderator’s opinion of a 15-second video clip.

Screenshot of TikTok’s moderation guidelines via Netzpolitik

TikTok claims its moderation techniques aim to protect users from cyberbullying before it even occurs. However, Netzpolitik pointed out that de-prioritizing videos means disabled users don’t gain viewership and, therefore, don’t have the same opportunity to build an audience that their nondisabled peers do. Whether limiting the reach of videos protects people from online bullying is unclear, but TikTok’s decision to subjectively de-prioritize content from people with disabilities across the board is clear discrimination.

In response to Netzpolitik’s investigation, a TikTok spokesperson told the outlet that the platform had since updated its moderation policies and that the original policy was “never intended to be a long-term solution.” A statement from TikTok provided to The Verge added:

Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy. While the intention was good, the approach was wrong and we have long since changed the earlier policy in favor of more nuanced anti-bullying policies and in-app protections.

While TikTok is the latest social media platform in the hot seat for controversial moderation policies, it is not the only company to use a heavy-handed approach in an attempt to create a safe environment for users with health conditions. In March, YouTube rolled out a policy designed to reduce the predation of minors on its platform by turning off comments on some channels YouTube deemed vulnerable to predators. The YouTube channel for the nonprofit Special Books By Special Kids (SBSK), which features kids with disabilities, was one such account.

When SBSK founders Chris Ulmer and Alyssa Porter tried to negotiate with YouTube to turn their comments back on, they met a dead end. They appealed for public support, highlighting that YouTube comments and social media serve as a crucial forum for people with disabilities and their families to connect and support each other. For YouTube to see SBSK’s channel as nothing more than high-risk amounted to discrimination, according to Ulmer and Porter.

“Now we cannot have a conversation on disability when it is already stigmatized,” Porter said. Ulmer added, “This is complete censorship against the disability community. It is discrimination.”

#UnsilenceSBSK

Today we need your help. YouTube has taken the discriminatory action of disabling all of SBSK's comment sections despite their overwhelming positivity. We need your assistance in persuading them to reverse this decision that will be extremely harmful to our movement towards inclusion of disability. Please take a brief moment to sign our petition at bit.ly/SBSKpetition. #UnsilenceSBSK

Posted by Special Books by Special Kids on Thursday, March 14, 2019

In addition, tweaking social media algorithms to hide content from people with health conditions may contribute to a sense of isolation. The internet often becomes a source of comfort and connection, especially for people who may be the only wheelchair user at their school or the only person with a mental illness in their workplace. When people with health conditions can’t see themselves represented on social media, that lack of representation becomes its own risk factor for isolation.

“It may just be a ‘comment section’ to some but as a community who follows, loves, respects and appreciates what this channel is doing we all need to be able to communicate by voicing our opinions and support one another,” Phuong Vu, a follower of Ulmer’s, wrote in a Facebook comment on SBSK’s page. “The only way we can do this is with this one little ‘comment section’ which is a big thing to us.”

Pinterest also implemented a new way to handle searches related to mental health topics such as stress, anxiety and self-harm in July. It began serving up self-care content created in collaboration with several mental health nonprofits while de-prioritizing thousands of search terms related to self-harm. Such a blanket policy, though intended to protect vulnerable users and connect more people with supportive mental health resources, also silences people who live with mental illness.

For people with health conditions, there’s another discriminatory side to social media: its potential use for surveillance. In April, for example, the Trump administration proposed an ableist (and ill-informed) plan to monitor the social media accounts of people who receive disability benefits to “catch” people who are faking their conditions and reduce disability fraud. Separately, in an attempt to reduce school shootings, Florida designed a database to track students with mental health issues and monitor what they post on social media, an act advocates call discriminatory.

“We already have a large problem of students with disabilities and minority students being disproportionally disciplined, arrested, suspended, expelled, restrained, isolated and [involuntarily civilly committed] and pushed out of public school,” Stephanie Langer, Esq., staff attorney for the Disability Independence Group, previously told The Mighty. “These provisions make it worse for these students while doing nothing to prevent another mass shooting or keeping any other students safe.”

In an attempt to curb bullying against people with disabilities or mental health conditions, social media platforms punish the theoretical victims instead of the perpetrators. And when people with health conditions or their loved ones do report abuse online, social media platforms don’t have adequate systems in place to respond — or they refuse to take action.

Natalie Weaver, whose late daughter Sophia lived with Rett syndrome and a facial difference, often received hateful and abusive comments in response to her daughter’s photos. She would report the comments or accounts to Instagram, and the Facebook-owned platform did nothing; it did not find the accounts in violation of its policies even when users told Weaver to “drown her” daughter. Frustrated by Instagram’s unclear moderation policies, Weaver lobbied the platform to create a tool for reporting disability-specific bullying and threats.

“Instagram’s hate speech [policy] does not include anything to protect the disability community,” Weaver previously told The Mighty. “I want the social media giant to take some responsibility to protect their users from being told to kill themselves or their children.”

To connect with a community that cares — and creates a space for safe, open dialogue — we encourage you to post a Thought or Question on The Mighty with the hashtag #CheckInWithMe to give and get support from others who understand.

Header image via natanaelginting/Freepik

Originally published: December 3, 2019