As part of its ongoing effort to better protect its primarily young audience from the dangers of online exposure, TikTok has today announced that it’s signed up to the Technology Coalition, an organization that works to protect children from online sexual exploitation and abuse.
Formed in 2006, the Technology Coalition comprises representatives from tech industry giants who work together to formulate best-practice approaches to detecting concerning trends and protecting young users.
As per the Technology Coalition website:
“We seek to prevent and eradicate online child sexual exploitation and abuse. We have invested in collaborating and sharing expertise with one another, because we recognize that we have the same goals and face many of the same challenges.”
Already, representatives from Facebook, Twitter, Google and Snap Inc. are members of the group, while Pinterest and Discord have also recently signed on.
Joining the Coalition will give TikTok additional policy guidance to help it build a safer environment, particularly for its young users.
As explained by TikTok:
“We’re at our strongest when we work together, which is why we’re proud to join the Technology Coalition. Through this membership, we hope to deepen our evidence-based approach to intervention and contribute our unique learnings from addressing child safety and exploitation. TikTok is also joining the board of the Technology Coalition along with a number of committees that aim to advance protections for children online and off and drive greater transparency of evolving threats to child safety.”
This is a particularly sensitive area for TikTok, given the young skew of its user base.
Indeed, last year, The New York Times shared an internal report from TikTok which indicated that more than a third of its US audience is aged under 14. Users under 13 are not technically permitted to access 'the full TikTok experience', with a limited, youth-focused version of the app served based on your registered profile age. But it's likely that many children are still getting around this restriction and signing up to the wildly popular app.
Given this, TikTok has an obligation to protect these users however it can, and various incidents have suggested that the platform is still working to refine its approaches and ensure youngsters are kept safe.
Last year, for example, TikTok was temporarily banned in Pakistan due to 'immoral and indecent' content, while the app was also blocked in Italy for a period following claims that a 10-year-old girl had died after taking part in a "blackout challenge" in the app, which saw users choking themselves in their clips. The European Commission is also investigating claims that TikTok exposes young users to inappropriate content.
Given these and other incidents, child safety needs to be a key priority for the app, and signing on to the Technology Coalition is another advance on this front, one that could help TikTok improve its approaches.
Really, the app seems ripe for exploitation, with young girls in particular incentivized to perform for the camera in order to secure more likes and follows and increase their social standing.
Hopefully, through this partnership, TikTok will be able to refine and reinforce its safety measures, and provide more protection for younger users in the app.