In its short time in existence, TikTok has faced significant scrutiny over its moderation processes and how it goes about protecting users by censoring certain content. Given the younger skew of TikTok's audience, this is a critical concern, and TikTok has already faced temporary bans in some regions over what it allows, or has allowed, on its platform.
Yet, at the same time, when TikTok does take action on a clip, many users have been frustrated by the lack of explanation or transparency in that process. And with the platform now implementing even tougher rules in some regions, such enforcement actions will only increase. That's why TikTok has today announced a new process that will highlight the specific rule or rules a removed video has violated, in order to help users understand why their video was taken down.
As explained by TikTok:
“For the past few months, we’ve been experimenting with a new notification system to bring creators more clarity around content removals. Our goals are to enhance the transparency and education around our Community Guidelines to reduce misunderstandings about content on our platform, and the results have been promising.”
TikTok says that, in testing, its updated notifications have reduced the rate of repeat violations, while visits to its Community Guidelines have nearly tripled.
“We’ve also seen a 14% reduction in requests from users to appeal a video’s removal. We believe this helps foster greater understanding of the kind of positive content and welcoming behavior that makes our community thrive.”
Based on these initial results, TikTok is rolling out its updated notifications to all regions. Now, whenever a video is removed for violating the platform's policies, the creator will get a specific explanation of which policy was violated, along with an easy option to appeal the decision.
The new prompt provides a specific rule reference for added context, and users can tap the 'Submit an Appeal' option at the bottom of the screen to challenge the ruling.
Additionally, when content is flagged as self-harm or suicide-related, TikTok will now also provide access to expert resources through a second notification.
As noted, perhaps more so than other platforms, TikTok is being pressed to moderate concerning content in order to protect younger users and avoid potentially harmful exposure. That pressure is pushing TikTok to err on the side of caution in its processes, which will likely mean more videos being removed that perhaps shouldn't be.
Given this, the update is a good way to address these concerns, ensuring that users understand the platform's rules so they can avoid crossing the line, while also providing a means of recourse if needed.
You can read more about TikTok’s notification update here.