TikTok Updates Moderation Guidelines Around QAnon and Harmful Targeting of Minorities

After a recent report highlighted flaws in TikTok's approach to removing QAnon-related content, despite the platform's rules against it, the company has today outlined additional steps it's taking to combat hate speech and protect users from manipulation and misinformation.

As explained by TikTok:

“In co-operation with academics and experts from across the globe, we regularly assess and evaluate our enforcement processes to ensure that we are supporting our community and protecting against new risks as they emerge. As part of our efforts to prevent hateful ideologies from taking root, we will stem the spread of coded language and symbols that can normalize hateful speech and behavior.”

This comes after Media Matters conducted an investigation which identified 14 QAnon-affiliated hashtags and variations that had proliferated on TikTok, despite QAnon content being outlawed under the platform's community guidelines.

Media Matters found that TikTok users were getting around the ban by using less specific hashtags:

“The hashtags “#RedOctober,” “#TheStormIsComing,” “#TheStormIsHere,” and “#TheStorm,” are four examples of how TikTok is being used to circulate the QAnon conspiracy theory.”

TikTok has since removed 11 of the 14 tags identified in the Media Matters report, and will now seek to strengthen its enforcement by better educating its teams on how such groups adjust their tactics to circumvent its rules.
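To illustrate the kind of problem TikTok's moderators face here, the sketch below shows one simple way a platform might catch variant hashtags, by normalizing tags and checking them against banned stems. This is a minimal, hypothetical example, not TikTok's actual system, and the ban list shown is illustrative only:

    # Illustrative sketch only -- not TikTok's actual system. It shows one way
    # a platform might catch variant hashtags (e.g. "#TheStormIsComing" as a
    # spin on a banned "#TheStorm") by normalizing tags and matching stems.

    BANNED_STEMS = {"qanon", "thestorm", "redoctober"}  # hypothetical ban list

    def normalize(tag: str) -> str:
        """Lowercase a hashtag and strip the leading '#'."""
        return tag.lstrip("#").lower()

    def is_banned_variant(tag: str) -> bool:
        """Flag a tag if it contains any banned stem as a substring."""
        normalized = normalize(tag)
        return any(stem in normalized for stem in BANNED_STEMS)

    for tag in ["#TheStormIsComing", "#TheStormIsHere", "#RedOctober", "#weather"]:
        print(tag, "->", "flag for review" if is_banned_variant(tag) else "ok")

Even a basic substring check like this catches the variants Media Matters flagged, which underlines how much of the real difficulty lies in keeping the ban list current as groups invent new coded language, rather than in the matching itself.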

In addition to this, TikTok is also working to remove more types of misinformation and harmful stereotypes about Jewish, Muslim and other communities.

“This includes misinformation about notable Jewish individuals and families who are used as proxies to spread antisemitism. We’re also removing content that is hurtful to the LGBTQ+ community by removing hateful ideas, including content that promotes conversion therapy and the idea that no one is born LGBTQ+.”

TikTok is further working to educate its moderation teams on the use of reclaimed language by minority groups, to ensure that terms which could be construed as slurs, but are now being used as expressions of empowerment, are not mistakenly removed.

“We’re working to incorporate the evolution of expression into our policies and are training our enforcement teams to better understand more nuanced content like cultural appropriation and slurs. If a member of a disenfranchised group, such as the LGBTQ+, Black, Jewish, Roma and minority ethnic communities, uses a word as a term of empowerment, we want our enforcement teams to understand the context behind it and not mistakenly take the content down. On the other hand, if a slur is being used hatefully, it doesn’t belong on TikTok.”

TikTok has faced various challenges with its content moderation efforts, and, rightly, much scrutiny over its efforts to protect young users. That likely makes these new initiatives even more important - according to reports, around a third of TikTok's US user base is aged under 14, giving the app reach to a hugely impressionable audience, where such theories and movements could take root and expand.

TikTok is taking on the challenge, and its response to the Media Matters report should, hopefully, help the platform better manage moderation concerns and reduce the spread of harmful movements.
