As more regions consider stringent usage restrictions for teen users, TikTok has outlined its evolving efforts to detect underage accounts and limit teen exposure in its app.
In a new overview, TikTok has explained how it’s now using AI age detection, among other measures, to keep younger users safe in the app.
As explained by TikTok:
“In most parts of the world, the minimum age to use TikTok is 13. We use a multi-layered approach to confirm someone's age or detect when they may not actually be the age they say they are.”
These measures include the basics, like requiring users to enter a birth date when setting up a profile in the app:
“If someone fails to meet our minimum age, we suspend their ability to immediately re-create an account using a different date of birth.”
TikTok also now uses an AI age-qualification process, which it’s expanding to more regions:
“We've been piloting new AI technologies in the U.K. over the last year and found they've strengthened our efforts to remove thousands of additional accounts under 13. We're planning to roll this technology out more widely, including in the EU, and are currently discussing it with our European privacy regulator.”
TikTok says that it also trains its human moderation teams to be alert to signs that an account may be used by a child under the age of 13.
“If [moderators are] reviewing content for another reason but suspect an account belongs to an underage user, they can send it to our specialized review team with deeper expertise on age assurance. Since judging age can be complex, our teams are instructed to err on the side of caution when making enforcement decisions. When in doubt, we will remove an account we suspect may be under 13. We also allow anyone to report an account they believe belongs to someone under 13. You don't even need a TikTok account to do this.”
This, in combination with TikTok’s restrictions on teen accounts (users under 16 can’t send DMs, and default screen time limits apply to younger users), underlines TikTok’s push towards more stringent, accurate detection measures that limit potential harms in its apps.
Indeed, TikTok says that these detection processes lead to the removal of around 6 million underage accounts globally every month.
This is a key focus for the app, as it is for all social platforms, as more regions consider new age restrictions on social apps in order to limit negative impacts on young users.
Over the past year, several European nations, including France, Greece and Denmark, have put their support behind a proposal to restrict social media access to users aged under 15, while Spain has proposed an access restriction for those under 16.
Australia and New Zealand are also moving to implement their own laws that would limit social media access to those aged 16 and over, as is Papua New Guinea, while Norway is also developing its own regulations.
To be clear, all of the major social platforms already restrict access to users aged under 13. So in technical terms, these proposals are not implementing some radical new requirement.
But where things are changing is in detection and enforcement, with these nations now looking to put more onus on the platforms themselves to improve their detection, at the risk of massive fines if they fail to meet these requirements.
The challenge, though, remains in establishing a universal, legally enforceable approach to age checking.
Right now, each platform is essentially going it alone, implementing its own best approach to restricting teen usage. But that’s not fair to all: less-resourced platforms are held to the same standards as the big players, while variable checking also presents enforcement challenges, as there’s no industry standard that can be upheld as a universal requirement.
TikTok acknowledges this, and has been working to share information on its approach with industry peers.
“Since its first session last year, TikTok has engaged in the Global Multistakeholder Dialogue on Age Assurance convened by the Centre for Information Policy Leadership (CIPL) and WeProtect Global Alliance. This dialogue aims to explore the complex challenges of age assurance and minor safety, whilst driving consensus across the sector. To that end, we have already started to explore whether the European Commission's planned age verification app could be an effective additional tool for us. However, for any solution to be truly effective, it's crucial to have a level playing field in which peer platforms are subject to the same regulatory requirements and are held to the same standards.”
This is the real challenge, and it should be the real aim for those considering these legal updates: establishing a more uniform standard for accurate and accountable enforcement, so that all platforms are held to the same requirements.