TikTok Updates Community Guidelines and Safety Tools to Better Protect Vulnerable Users

TikTok has announced a range of new updates to its Community Guidelines as it seeks to evolve its rules and tools in line with increasing usage.

TikTok has faced heavy scrutiny over its approach to content moderation and what it allows from young users, and has even been banned in some regions over ‘immoral and indecent’ video clips. This is a key area of concern – according to reports, around a third of TikTok’s US users are under 14 years of age, and even the most followed TikTok star, Charli D’Amelio, is only 16.

Protecting young, vulnerable people is critically important, and with so many youngsters flocking to the app, it’s rightfully become the focus of regulatory groups looking to ensure adequate protections are in place.

In line with this, TikTok has today announced new measures on four key fronts:  

  • TikTok has updated its policies around self-harm and suicide content, including self-injury behaviors. “Our policy on eating disorder content also has additional considerations to prohibit normalizing or glorifying dangerous weight loss behaviors.” TikTok says that it’s consulted with various mental health experts to update its guidelines in these key areas.
  • TikTok’s also updated its policies around bullying and harassment, with more explicit guidelines around the types of content and behaviors that will not be tolerated in the app – “including doxxing, cyberstalking, and a more extensive policy against sexual harassment”.
  • It’s also outlined new regulations on content that promotes ‘dangerous dares, games, and other acts that may jeopardize safety’.
  • Lastly, TikTok has also updated its approach to dangerous individuals and organizations, focusing ‘more holistically on the issue of violent extremism’. TikTok may seem like a lesser concern than, say, Facebook in this respect, but reports have shown that dangerous movements, like QAnon, have increasingly targeted TikTok as the platform has grown.

The changes are now listed in an updated version of TikTok’s Community Guidelines. How TikTok enforces these new regulations remains to be seen, but as a first step, the updates cover some critical elements of concern, and should lead to improved enforcement in the app.

In addition to this, TikTok also notes that it’s looking to improve its inclusivity tools by adding a new text-to-speech feature, which will enable users to convert typed text to a voice “that plays over text as it appears in a video”.

Interestingly, TikTok has seen an influx of videos of late which include voice-to-text captions overlaid on the screen. That feature is actually facilitated via Instagram’s Threads app, not within TikTok itself, which has led to a surge in downloads of Threads. But people aren’t looking to send those videos within Threads itself – they’re uploading them to TikTok, which seems to have artificially inflated Threads usage somewhat. An interesting, tangentially related aside.

TikTok’s also adding some new mental health support and assistance tools within the app:

“Over the coming week, we’ll roll out updated resources to support people who may be struggling. These resources were created with guidance from leading behavioral psychologists and suicide prevention experts, including Providence, Samaritans of Singapore, Live for Tomorrow, and members of our US Content Advisory Council. Now, if someone searches for terms like ‘selfharm’ or ‘hatemyself’ they’ll see evidence-based actions they can take.”

TikTok help platform

It’s also looking to reduce exposure to potentially disturbing video clips by hiding them behind a new warning screen.

“We’re introducing opt-in viewing screens on top of videos that some may find graphic or distressing. These types of videos are already ineligible for recommendation into anyone’s For You feed, and this feature aims to further reduce unexpected viewing of such content by offering viewers the choice to skip the video or watch it.”

And lastly, TikTok’s looking to play its part in ensuring optimal take-up of the coming COVID-19 vaccine by adding a dedicated vaccine FAQ section to its COVID-19 resource hub.

TikTok COVID-19 Resource Hub

TikTok says that its COVID-19 info tool has been viewed over 2 billion times globally over the last six months, underlining both the platform’s growth and its potential influence on user behavior. The new vaccine updates will ideally add to this, and help TikTok distribute relevant, accurate information to its users.

As noted, these are important areas of focus for TikTok specifically, and it’s good to see the platform roll out these changes ahead of the holiday period, when usage of the app will likely spike. The next step is to ensure TikTok’s moderation teams enforce the new updates as listed, which is always difficult as a platform scales.

But the updates here point to the right elements, and hopefully they’ll help TikTok evolve its approach in step with its continued growth.
