In June 2024, a 13-year-old boy in New York threatened self-harm twice in one day while using VRChat. That same month, a 13-year-old boy in Florida disclosed his personal phone number on Discord to a user he had never met in real life. And in November 2023, a 13-year-old girl in Florida was repeatedly asked for an intimate photo by a stranger on Discord and Roblox over the course of a week, a photo she eventually sent.
These scenarios mirror threats manifesting in thousands of homes every day, where children face doxxing, sextortion and bullying via online games. Why, then, has there been such a tepid response to safeguarding children in these virtual playgrounds?
The answer is twofold: a lack of public awareness and the industry’s focus on the bottom line. Big tech firms like Microsoft, Nintendo and Sony, and game developers like Riot Games, Valve, Epic and EA, claim that monitoring every player interaction is impractical and instead rely on players to self-report threats. For companies like Microsoft, which reported a net income of $21.9 billion (a 20 percent increase) for the quarter ended March 31, 2024, this is clearly not a question of affordability; it is a refusal to take responsibility. The safety of our children demands more than kicking the can down the road. It requires action.
And while legislators in more than 30 states currently have pending bills to curb threats to minors on social media, and platforms like TikTok face scrutiny for their influence on young minds, the gaming industry has largely escaped similar censure.
The industry continues to grow at a rapid clip, with tens of millions of gaming consoles sold each year and a gaming audience expected to grow by an additional 400 million players by 2029, reaching over 3 billion users globally. That growth underscores the urgent need to balance the benefits of gaming with rigorous safety measures.
Tech firms also often cite privacy concerns as a reason for not sharing security information with parents, yet they are more than happy to utilize user data when it benefits them. For example, Microsoft’s Copilot tool records gaming sessions to help players improve their gameplay.
Video games aren’t merely a source of entertainment; they’re also a place where kids can build reasoning skills, solve problems and connect with like-minded peers. However, because the gaming industry is doing the bare minimum to protect young players, children increasingly face mental health challenges and other dangers from gameplay. So, how can we change this?
Gaming platforms and tech giants must own up. Industry leaders and gaming manufacturers must intensify their efforts to monitor and mitigate threats. Transparency is critical. While cooperation with law enforcement is a positive step, these companies seldom disclose the nature of these interactions, leaving parents unaware of the dangers their children face.
Legislative action is crucial. The legislative landscape needs to evolve to reflect the new realities of digital gaming. Just as there has been significant legislative pushback against the perils of social media, similar measures must extend to the gaming industry. It’s not just about cooperation; it’s about compulsory safety mechanisms that are transparent and effective.
We need to push for industry-wide collaboration and greater transparency. Currently, tech platforms are grading their own homework when it comes to online monitoring, security and safety. It’s time to pull back the curtain and foster interoperability within the industry to ensure online safety for kids everywhere. These efforts should be complemented by helping parents understand the risks their children face and manage them effectively.
Gaming has become a rewarding, everyday hobby for millions of kids — and yet, the industry at large has not put the systems and standards in place to make these virtual communities safe.
The stakes are too high. The leading gaming platforms must be transformed into environments where safety is integral, not treated as an afterthought. It’s time for the video game industry to be held to the same standards as social media. The government’s negligence in holding game manufacturers and developers accountable for failing to safeguard children online is alarming and perpetuates a dangerous environment where young users are vulnerable to exploitation and harm.
Only through collective effort — industry-wide, legislatively and at home — can we ensure a safer digital playground for our children.
Ron Kerbs is the Founder and CEO of Kidas, a technology company developing AI-powered text and voice communication tools to protect online users from toxicity, cyberbullying and other predatory behavior.
