TikTok is set to enhance its age-verification technology across the European Union in the coming weeks. This move comes amid increasing calls for stricter regulations on social media use by minors, particularly in the UK, where discussions about potential bans for under-16s are gaining traction. The platform, owned by ByteDance, faces mounting pressure to effectively identify and remove accounts belonging to children.
The new age-verification system has been trialed in the EU over the past year. It uses profile data, videos, and behavioral indicators to assess whether a user may be under the age of 13. Accounts flagged by the technology are reviewed by specialized moderators rather than being banned automatically, allowing for a more nuanced approach. A previous UK pilot of the system led to the removal of thousands of accounts.
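TikTok has not disclosed how these signals are weighted or combined. As a purely illustrative sketch of the flag-then-review pattern described above, the Python below combines a few invented heuristic signals into a score and routes high-scoring accounts to a human review queue instead of banning them outright; every class, field, weight, and threshold is a hypothetical stand-in, not TikTok's actual logic.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class AccountSignals:
    """Hypothetical per-account inputs: profile data, video metadata,
    and behavioral indicators (all names are illustrative)."""
    stated_age: int
    bio_text: str
    video_captions: List[str]
    follows_mostly_child_creators: bool
    active_during_school_hours: bool


def underage_likelihood(signals: AccountSignals) -> float:
    """Combine weak signals into a rough 0-1 likelihood that the
    account holder is under 13. Weights are invented for illustration."""
    score = 0.0
    if signals.stated_age < 16:
        score += 0.2
    if any(phrase in signals.bio_text.lower() for phrase in ("year 7", "primary school")):
        score += 0.3
    if signals.follows_mostly_child_creators:
        score += 0.2
    if signals.active_during_school_hours:
        score += 0.1
    if any("my mum said" in caption.lower() for caption in signals.video_captions):
        score += 0.2
    return min(score, 1.0)


REVIEW_THRESHOLD = 0.6  # hypothetical cut-off for escalation
review_queue: List[AccountSignals] = []


def triage(signals: AccountSignals) -> str:
    """Flagged accounts go to specialized human moderators for review;
    the heuristic alone never triggers a ban."""
    if underage_likelihood(signals) >= REVIEW_THRESHOLD:
        review_queue.append(signals)
        return "queued_for_human_review"
    return "no_action"
```

Routing flagged accounts to human reviewers rather than acting on the score directly mirrors the moderation-first approach the platform describes.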
Australia implemented its social media ban on users under 16 on December 10, 2025. Since then, the Australian eSafety Commissioner has reported the removal of over 4.7 million accounts across ten platforms, including TikTok, YouTube, Instagram, Snapchat, and Facebook.
As European authorities examine how platforms verify user ages, TikTok’s rollout aligns with EU regulatory requirements. The company has collaborated with Ireland’s Data Protection Commission, its primary privacy regulator in the region, to develop this system.
Concerns regarding children’s screen time have prompted discussions among policymakers. Recently, UK Prime Minister Keir Starmer expressed openness to a potential ban on social media for young users, citing alarming reports of young children spending excessive amounts of time on devices. Starmer had previously resisted such measures, believing they would be difficult to enforce and could push teenagers toward riskier online environments.
Advocacy for parental rights in managing children’s online activities has also gained momentum. Ellen Roome, whose 14-year-old son died in circumstances she believes may be linked to an online challenge, has campaigned for parents to be given the right to access their deceased children’s social media accounts.
Beyond the UK, the European Parliament is pushing for age restrictions on social media platforms, while Denmark is advocating a ban on access for children under 15.
In 2023, a Guardian investigation revealed that TikTok moderators had been instructed to allow users under 13 to remain on the platform if they claimed a parent was overseeing their account. The finding underscored the need for more robust verification, which TikTok’s new technology is intended to address.
As the landscape of online safety continues to evolve, TikTok’s wider rollout of age verification marks a significant step in addressing concerns about the protection of minors in digital spaces.