Roblox Corporation (RBLX) announced Tuesday it's fundamentally changing how communication works on its platform, introducing age-based chat controls designed to keep adult conversations away from its massive youth audience. The changes represent one of the more aggressive safety overhauls in the gaming industry, though they also raise questions about privacy and user experience.
December Rollout, Then Worldwide Expansion
The company has opened its voluntary age estimation system to users everywhere, but the requirements become mandatory soon. Starting early December, Roblox will enforce age verification in Australia, New Zealand and the Netherlands. By early January, enforcement expands to all other regions where chat is available, making age checks essentially non-negotiable for anyone who wants to communicate on the platform.
Face Scans Sort Users Into Age Buckets
Here's how it works: users grant camera access and follow on-screen instructions, including turning their face left and right. The facial age estimation technology then places them into one of several age bands: under-9, 9–12, 13–15, 16–17, 18–20, or 21 and older. A user's chat permissions depend entirely on which band they land in.
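To make the banding concrete, here is a minimal sketch of how an age estimate might map to those buckets. The band boundaries come from the bands listed above, but the function name, data shape and simple threshold lookup are illustrative assumptions, not Roblox's actual implementation.

```python
# Illustrative only: band labels follow the article; everything else is assumed.
AGE_BANDS = [
    (9, "under-9"),   # estimated age below 9
    (13, "9-12"),
    (16, "13-15"),
    (18, "16-17"),
    (21, "18-20"),
]

def assign_age_band(estimated_age: int) -> str:
    """Map a facial age estimate to one of the bands described in the article."""
    for upper_bound, band in AGE_BANDS:
        if estimated_age < upper_bound:
            return band
    return "21+"
```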
It's a significant shift for a platform that's built its success on open communication and user-generated content. The goal is to create age-appropriate interactions, but it also means fundamentally changing how millions of users experience the service.
Extra Restrictions for the Youngest Players
Kids under nine face the strictest controls. In-experience chat will default to "off" unless a parent completes their own age check and manually enables it. Children under 13 will also face limits on chat outside gameplay areas. Parents maintain access to their children's accounts and can update birthdates using Parental Controls after completing the age scan.
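A rough way to picture those defaults is below. The dataclass, field names and the choice to model "limits" on under-13 chat as a simple on/off flag are assumptions made for illustration, not the platform's real settings model.

```python
from dataclasses import dataclass

@dataclass
class ChatSettings:
    in_experience_chat: bool        # chat inside a game/experience
    chat_outside_experiences: bool  # platform-level chat outside gameplay

def default_chat_settings(age_band: str, parent_enabled: bool = False) -> ChatSettings:
    """Hypothetical defaults per age band, following the policy described above."""
    if age_band == "under-9":
        # Off by default; a parent who has completed their own age check can enable it.
        return ChatSettings(in_experience_chat=parent_enabled,
                            chat_outside_experiences=False)
    if age_band == "9-12":
        # Under-13 users face limits outside gameplay areas (modeled here as off).
        return ChatSettings(in_experience_chat=True,
                            chat_outside_experiences=False)
    return ChatSettings(in_experience_chat=True, chat_outside_experiences=True)
```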
The restrictions come as Roblox faces intense scrutiny over child safety, and the company is clearly trying to get ahead of regulatory pressure and parental concerns.
Privacy Trade-Offs and Industry Support
Chief Safety Officer Matt Kaufman said the company wants every user to have "a safe, positive, age-appropriate experience." Stephen Balkam, CEO of the Family Online Safety Institute, called age estimation a proactive step. Jules Polonetsky, CEO of the Future of Privacy Forum, said the system strengthens protections without compromising user rights.
Roblox emphasized that its vendor, Persona, deletes facial images and videos immediately after age analysis. That matters because collecting face scans raises obvious privacy concerns, especially when children are involved. The age checks remain technically optional, but chat features will be locked unless users verify, which makes "optional" somewhat aspirational.
Part of a Broader Safety Push
The facial age checks don't exist in isolation. Roblox says it has launched more than 145 safety initiatives since January 2025, including real-time AI monitoring, limits on mature content, additional restrictions for younger users, and stronger parental controls. The company also unveiled a new Safety Center offering guidance for parents and caregivers.
Whether all this is enough to satisfy regulators, parents and critics remains to be seen. But it's clear Roblox is betting that aggressive age verification is the path forward for a platform where a large share of users are children.