Is Roblox a Safe Platform for Kids?

Posted on February 5, 2026



Roblox’s player base is made up mostly of minors, yet many users and parents are now questioning the company’s ability to protect the children on its platform.

Roblox is an online platform intended for all ages that promises a moderation system to protect its users. Recently, after Roblox banned a predator catcher named Schlep, many were left to question whether Roblox is even trying to protect the minors on its website. Multiple US states have since sued Roblox for negligent moderation of predators.

The situation began on August 8, 2025, when the user ‘Schlep’ received a cease-and-desist letter from Roblox and had his account terminated for running a predator-catching group, which Roblox said violated its policy against “vigilante groups.”

Schlep is a Roblox YouTuber who works with fellow YouTuber ‘Jidion’ and law enforcement to run sting operations catching predators on the platform. Schlep was inspired to do these operations because he was himself a victim of grooming on Roblox as a minor, and he believes that if Roblox isn’t going to protect child users, then he needs to.

Outrage spread throughout Roblox’s community following the ban. Many agreed that Roblox’s moderation was not properly protecting minors on the platform and questioned why Roblox would terminate a user who was trying to solve one of its biggest problems.

Following the Schlep controversy, five US states have sued Roblox for failing to protect children on its platform: Louisiana, Kentucky, Texas, Florida, and Iowa.

Liz Murrill, the Louisiana attorney general, stated her reasoning to Variety: “[Roblox] knowingly and intentionally fails to implement basic safety controls to protect child users from predators.”

A Fox 13 News article states that Florida claims Roblox is violating the Children’s Online Privacy Protection Act (COPPA) by collecting information from minors under the age of 13 without parental permission.

In December 2025, Roblox began rolling out an age verification update that requires children to submit an AI scan of their face, which sorts them into different age groups. Once an account is assigned to an age group, it can only see messages from certain age groups adjacent to its own. This is Roblox’s attempt at solving its lack of child safety; however, the update has drawn many criticisms.

The third-party company Roblox is using for these scans, Persona, says it retains the biometric data for 30 days after a scan, including the faces of children, which presents a potential COPPA violation.

Many users have also pointed out that a predator can buy an account already assigned to a younger age group through sites like eBay, and from there choose which age group to enter. This presents a massive problem with the age verification update, allowing predators to find children in isolated age groups and move their conversations off-platform.

Roblox’s safety issues persist into 2026, and experts advise parents not to allow children to use the platform due to Roblox’s poor child safety protections.

Posted in: Jonathan Lado