OPINION:
Big Tech wants parents to believe everything is under control, but the reality is very different when it comes to the safety of children online.
Take Roblox, the gaming giant with more than 70 million daily users, most of them children. Louisiana has just sued the company for failing to protect children from predators. At the same time, Roblox banned one of the very watchdogs exposing those dangers: Schlep, a YouTuber who may now team up with Chris Hansen, the journalist who made confronting online predators a national conversation.
Think about that: Predators still prowl Roblox’s chatrooms, but the man documenting their activity is the one silenced. That’s not child safety; it’s brand management, and it’s exactly what we’ve come to expect from Big Tech.
For years, Roblox has marketed itself as a digital playground, a space for creativity and community. Yet, as in any other playground, predators lurk at the edges. In-game chat features, private servers, and limited moderation create openings that bad actors exploit to groom children.
Parents have raised alarms. Watchdogs have documented abuses. Now, Louisiana Attorney General Liz Murrill has stepped in to argue that the company ignored its responsibility to protect minors.
Roblox’s response has been to deny, deflect and punish critics. Schlep’s ban is proof. His only “crime” was showing the public what predators do on the platform. By removing him, Roblox sent a clear message: Keeping whistleblowers quiet is easier than fixing a broken system.
This is a pattern across Silicon Valley. Social media platforms routinely censor speech that makes them look bad, whether it’s users calling out election interference, medical misinformation or, now, child exploitation. They move swiftly against critics, yet predators and scammers often slip through the cracks. It’s a two-tier enforcement system: swift and merciless against whistleblowers, sluggish and inconsistent against real threats.
Mr. Hansen knows this pattern all too well. Nearly two decades ago, his “To Catch a Predator” series on NBC exposed how easily children could be targeted in chatrooms. The series shocked parents into paying attention. Mr. Hansen’s new work applies that same scrutiny to Roblox, showing that even as technology evolves, the danger remains. Only now, the predators aren’t hiding just in chatrooms; they’re also hiding in games millions of children log into every day.
Big Tech’s defenders argue that these platforms are too large to police effectively, but that excuse doesn’t hold water. If Roblox can build a billion-dollar empire on the backs of children’s gameplay, it can afford better safeguards. If it can swiftly remove Schlep, it can swiftly remove predators. What’s missing isn’t capacity; it’s will.
Parents also must recognize their role. Too many treat Roblox as a digital babysitter. Unmonitored access to strangers in a chatroom is no safer on Roblox in 2025 than on AOL in 2005. Children need guidance, limits and oversight. Parents must learn to set privacy controls, supervise gameplay and teach children the dangers of sharing personal information. Digital parenting is not optional.
However, no amount of parental vigilance can replace corporate responsibility. Roblox actively markets itself to children, which means it has a duty to make safety the first priority, not the last. Stronger moderation, transparent reporting of predator activity, partnerships with watchdogs and outside scrutiny should be the baseline. Instead, the company hides behind vague “terms of service” to punish its critics.
Louisiana’s lawsuit may force accountability where corporate goodwill has failed. Courts have a way of forcing companies to confront real-world priorities. Still, the broader problem runs deeper. Big Tech has grown accustomed to silencing voices that make it uncomfortable while neglecting the very protections it promises to families. That hypocrisy is dangerous, and children are the ones paying the price.
The stakes could not be higher. Online predators ruin more than games; they also ruin lives. Roblox gambles with children’s safety whenever it chooses to protect its image instead of its users. Mr. Hansen’s original work showed the country what could happen in an AOL chatroom. His current work shows that Roblox has become no less risky. The predators haven’t gone away. The platforms just got bigger, richer and more insulated from consequences.
Parents, policymakers and platforms face a choice. Do we allow Big Tech to continue silencing critics while predators roam free, or do we demand transparency, accountability and real reform?
Louisiana has made its choice. Mr. Hansen and Schlep have made theirs. The question is whether the rest of us — parents, courts and, yes, even Congress — will finally step up.
Big Tech has proved it won’t police itself. Until it does, the responsibility falls to us to hold it accountable. When whistleblowers are banned and predators are left behind, every child online is at risk.
• Jessamyn Dodd is a multimedia journalist and former nurse with nearly five years of reporting experience. She has worked for outlets including The Epoch Times, the New York Post and Imperial Valley Press, covering crime, politics, entertainment and the U.S.-Mexico border.