Inside TikTok’s AI age-verification pilot under the EU’s DSA: how Yoti scans, GDPR safeguards and the coming Digital Identity Wallet reshape safety and privacy.

In December 2025, TikTok removed thousands of accounts belonging to users under the age of 13 as part of its EU age-verification pilot. The system analyses profile information, posted videos, and behavioural signals to comply with the European Union's Digital Services Act (DSA). The pilot, which ran for a year in select EU countries, was the first large-scale test of AI-driven age verification under the DSA, and it demonstrated the complexities of aligning regulatory requirements with user privacy.
TikTok’s age-verification system uses artificial intelligence and human moderation to identify and remove accounts belonging to users under 13. The process starts with an AI model that examines profile information, posted videos, and behavioural signals such as posting frequency and interaction patterns. Accounts flagged by the AI are reviewed by specialist moderators to avoid automatic bans and reduce false positives.
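Conceptually, this is a two-stage triage pipeline: a statistical model scores accounts, and anything above a threshold goes to a human rather than being banned automatically. The sketch below illustrates that shape only; the features, weights, and threshold are hypothetical stand-ins, since TikTok has not published its model.

```python
from dataclasses import dataclass

@dataclass
class Account:
    bio_text: str
    posts_per_day: float
    follows_school_accounts: bool

# Hypothetical scoring function returning a probability-like score that
# the user is under 13. TikTok's real features and model are not public.
def estimate_underage_score(account: Account) -> float:
    score = 0.0
    if "grade" in account.bio_text.lower():
        score += 0.4
    if account.posts_per_day > 10:
        score += 0.2
    if account.follows_school_accounts:
        score += 0.3
    return min(score, 1.0)

REVIEW_THRESHOLD = 0.5  # assumed escalation cutoff, for illustration

def triage(account: Account) -> str:
    """Stage 1: the model flags; stage 2: a specialist moderator decides.
    Note there is no 'ban' outcome here: flagged accounts are escalated,
    never removed automatically, which is how false positives are reduced."""
    if estimate_underage_score(account) >= REVIEW_THRESHOLD:
        return "escalate_to_human_review"
    return "no_action"
```

The key design point mirrored here is that the model's output is a routing decision, not a final one: only the human-review stage can remove an account.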
During the EU pilot, thousands of accounts were removed. However, some adults were misclassified as underage. Users who believed their accounts were wrongly flagged could appeal using facial age estimation via Yoti, credit card authorisation, or government-approved identification. TikTok stated that age-prediction data is used solely for moderation and deleted afterward, in line with the EU’s General Data Protection Regulation (GDPR).
“By adopting this approach, we are able to deliver safety for teens in a privacy-preserving manner,” TikTok stated in a blog post. “We take our responsibility to protect our community, and teens in particular, incredibly seriously.”
The system restricts direct messaging for under-16s and limits screen time to 60 minutes for under-18s.
The EU’s approach to age verification focuses on compliance with the DSA, which requires platforms to protect minors from harmful content. TikTok’s system was developed in collaboration with Ireland’s Data Protection Commission to ensure alignment with local laws. This approach contrasts with Australia’s outright ban on social media for users under 16, which has led to the removal of 4.7 million accounts since December 2025. Critics argue that such bans may push teens toward unregulated platforms or the dark web.
TikTok’s system positions the company as an early adopter of the DSA’s age-verification mandates. Meta, which also uses Yoti for age verification on Facebook, faces a similar balancing act between compliance and privacy, as do platforms navigating state-level age-verification laws in the United States.
TikTok’s age-verification rollout coincides with the EU’s plans to launch its Digital Identity Wallet by the end of 2026. The wallet aims to standardise digital identity across the bloc, including age-verification functionalities. This could reduce platforms’ reliance on third-party tools like Yoti, though it introduces new challenges for interoperability and user trust.
By 2026, all EU member states must provide at least one national wallet, allowing users to prove their age without sharing additional personal data. This shift may change how platforms like TikTok verify user ages, moving toward standardised digital credentials.
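The privacy property behind such credentials is selective disclosure: the wallet presents a signed yes/no claim ("over the threshold age") and the platform verifies the signature without ever seeing a birthdate. The sketch below illustrates that flow only; real EUDI Wallet credentials use public-key cryptography and standardised formats, whereas this demo uses a shared HMAC key purely to keep the example self-contained.

```python
import hmac
import hashlib
import json

# Hypothetical issuer key. A real wallet scheme would use asymmetric
# signatures (PKI), not a secret shared with the verifier.
ISSUER_KEY = b"demo-issuer-secret"

def issue_age_attestation(birth_year: int, current_year: int, threshold: int) -> dict:
    """The issuer reduces the birthdate to a boolean claim and signs it.
    The birthdate itself never appears in the attestation."""
    claim = {"age_over": threshold, "result": current_year - birth_year >= threshold}
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}

def platform_verifies(attestation: dict) -> bool:
    """The platform checks only the signature and the boolean claim."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, attestation["signature"])
            and attestation["claim"]["result"])
```

In this model, a platform like TikTok would receive only the signed boolean, which is what reduces its reliance on collecting documents or face scans via third parties.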
