As US tech giants weigh possible takeover offers for the rising video app, TikTok has this week announced a new set of measures to better detect and remove misinformation on its platform, while it's also, interestingly, working with experts from the U.S. Department of Homeland Security to protect against foreign influence.
Interesting, in this sense, because TikTok has been identified by some experts as a distribution platform for pro-China propaganda – a characterization which, of course, TikTok is very keen to push back against.
First off, on misinformation – as noted by TikTok, it already has measures in place to:
“…prohibit misinformation that could cause harm to our community or the larger public, including content that misleads people about elections or other civic processes, content distributed by disinformation campaigns, and health misinformation.”
Now, with the US Presidential Election looming, the platform is looking to beef up those efforts:
- We’re adding a policy which prohibits synthetic or manipulated content that misleads users by distorting the truth of events in a way that could cause harm.
- We’re making our policy around coordinated inauthentic behavior more transparent. Our Community Guidelines already prohibit content from disinformation campaigns, and this addition makes our stance against coordinated inauthentic behavior unambiguous.
So, TikTok will now have more specific guidelines against the use of deepfakes and the coordinated, strategic use of the platform to influence opinion. The latter form of campaigning would be more difficult to operate on TikTok either way, due to the way its algorithm works, but TikTok's looking to ensure it has clear enforcement measures in place to combat such activity, in case it becomes a problem.
In addition to this, TikTok also notes that it does not accept political ads:
“The nature of paid political ads isn’t something we think fits with the experience our users expect on TikTok.”
TikTok also notes that it’s expanding its partnerships with PolitiFact and Lead Stories to improve its fact-checking processes, with a specific focus on US election misinformation, while it’s also adding an election misinformation option to its in-app reporting feature, enabling users to easily report content or accounts for review.
And TikTok is also working with the Department of Homeland Security’s Countering Foreign Influence Task Force, sharing information and insights with the group to improve its detection methods.
The announcements make sense, and are in line with what most other social platforms are implementing on this front – though as noted, it’s slightly different in TikTok’s case due to the position it’s in with regard to its connections to the Chinese Government.
As a Chinese-owned business, TikTok is bound by China's cybersecurity laws to share its data with the CCP on request. TikTok has repeatedly stated that it does not share data with Beijing, but it can't say that it won't have to do so in future, if requested. Because it will – which is why the US Government is now pushing for TikTok to be separated from its Chinese roots, or face a ban in the US.
According to a report from The Australian Strategic Policy Institute, published late last year, TikTok is also “a vector for censorship and surveillance.”
“Unlike Western social media platforms, which have traditionally taken a conservative approach to content moderation and tended to favor as much free speech as possible, TikTok has been heavy-handed, projecting Beijing’s political neuroses onto the politics of other countries.”
That makes TikTok's approach to political interference more significant, in that some already view the app itself as facilitating the spread of political messaging.
It's difficult to prove such claims either way, which, again, is why the US Government has pushed for the app to be sold off, or cut off.
Given this, TikTok is clearly saying the right things, and taking the right steps – and hopefully these new measures do protect against election interference within the app. But its other concerns are clearly a much bigger problem – and we're nowhere near hearing the last of them.