TikTok Once Again Comes Under Scrutiny Over Extreme Moderation and Censorship

While there’s no denying TikTok’s popularity, or its potential to become a major player in the social media landscape, the platform still has a way to go to maximize its prospects. Its own moderation processes are continually proving to be a stumbling point. Monetization, and providing a pathway for creators to generate income on the platform, should be TikTok’s biggest challenge at this stage. Instead, its links to more stringent Chinese Government policies, and odd initiatives to reduce cyberbullying, keep resurfacing, raising more questions about just how independent TikTok is, and what its broader obligations to the Chinese regime might be.

The latest on this front comes via The Intercept, which has uncovered a new set of internal moderation guidelines from TikTok.

As per The Intercept:

“The makers of TikTok instructed moderators to suppress posts created by users deemed too ugly, poor, or disabled for the platform, according to internal documents. These same documents show moderators were also told to censor political speech in TikTok livestreams, punishing those who harmed “national honor” or broadcast streams about “state organs such as police” with bans from the platform.”

That’s pretty extreme – and what’s worse, the team from The Intercept has published the documents, which spell out these exact regulations.

That’s not a good look for the company, which, as you may recall, just recently announced plans to open a new Transparency Center in the US, enabling third parties to come in and see its moderation teams at work.

I’d hazard a guess that these rules will not be on display to any visitors on-site.

In response, TikTok has said that “most of” these guidelines, which relate specifically to live-streams on the platform, are either no longer in use, or never were in the first place.

On the policies relating to people’s appearance, TikTok’s Josh Gartner told The Intercept that these rules: 

“…represented an early blunt attempt at preventing bullying, but are no longer in place, and were already out of use when The Intercept obtained them.”

That echoes what TikTok said last December, when Netzpolitik published another section of the platform’s internal guidelines, which showed that TikTok had actively limited the reach of content uploaded by users who appeared to have disabilities.

In that instance, the documentation advised TikTok moderators to flag content from users who appeared to have autism, Down’s syndrome or facial disfigurements. The reach of their uploads was subsequently restricted. The intention of the policy, in that case, was again to “protect” users who were deemed to be “susceptible to harassment or cyberbullying based on their physical or mental condition”.

So TikTok’s internal guidelines, which The Intercept notes were published in 2019, actively limited the reach of content posted by users which its moderation teams deemed to be:

  • Overweight
  • Ugly
  • Elderly
  • Poor
  • Disabled
  • LGBT

In order to, ostensibly, reduce cyberbullying.

But as you can see in the example notes above, the ruling on content featuring ugly or overweight people relates to: 

“If the character’s appearance is not good […] the video will be much less attractive, not worthing [sic] to be recommended to new users”

So TikTok’s explanation of it being an anti-bullying measure is also questionable.

And this is before we even get to the Chinese Government interference.

A second document obtained by The Intercept outlines TikTok’s specific regulations around anti-government speech, and how such content is to be handled:

“Any broadcasts deemed by ByteDance’s moderators to be “endangering national security” or even “national honor and interests” were punished with a permanent ban, as were “uglification or distortion of local or other countries’ history,” with the “Tiananmen Square incidents” cited as only one of three real world examples.”

This is similar to – if not the same as – TikTok moderation instructions obtained by The Guardian last September, which detailed how the app’s team had been instructed to censor videos mentioning Tiananmen Square, Tibetan independence, or the banned religious group Falun Gong.

In fact, TikTok says that both of these new discoveries relate to the same documents referred to in these past investigations, and in both cases, the regulations are no longer in operation. TikTok was less clear on which specific elements are no longer being applied, but the company has been working to distance itself from these rulings in recent months, in order to reassure users that their personal data is safe, and that TikTok operates under separate guidelines to the Chinese version of the app, ‘Douyin’, with entirely independent management and systems.

That’s part of the reason why TikTok is looking to open a US-based transparency center, to show that it’s not being governed by Chinese regulations, while TikTok is also looking to remove all Chinese-based moderators from non-Chinese content rulings, further separating its teams.

TikTok knows that it needs to allay such concerns. Already, the app is under national security review in the US, while the Australian Strategic Policy Institute says that TikTok is “a vector for censorship and surveillance”. Reddit CEO and co-founder Steve Huffman also recently labeled the app as ‘spyware’ in relation to the ways in which it tracks user activity.

TikTok has seen major growth in various markets, with the app now close to hitting 2 billion installs worldwide. But in order to maximize the attention it’s winning, it needs to be able to convert those users into dollars, either through advertising, which will be increasingly targeted, or through eCommerce integrations. Both will call for a degree of trust in how user information is tracked, and while concerns around its processes remain, the platform will continue to face challenges, and potential restrictions, impeding its progress.

In some ways, given that these new reports relate to the documents previously reported on, TikTok has already begun moving past these issues. But the concerns here are very real – TikTok, as late as mid-2019, was applying extreme moderation rulings on content, and limiting free speech within the app.

That is a significant concern for regulators, and will likely prompt some businesses to reconsider their usage of the app.

TikTok is working to resolve the various questions, and improve its systems. But this latest report shows that it still has some work to do on this front. 
