In recent Senate testimony, an anonymous Texas mother, “Jane Doe,” lamented how her 15-year-old son’s mental and physical health deteriorated after using Character.AI, an app marketed in Apple’s App Store as safe for children 12 and older. In a matter of months, the boy became paranoid, violent and homicidal. He lost 20 pounds after refusing to eat and even cut himself in front of his siblings. He is now in a residential treatment facility.
When Jane sued, Character.AI claimed her son was bound by the terms of service he had "signed" as a minor, which capped the company's liability at $100. The case was forced into arbitration, where the company then refused to participate, leaving the family without recourse as their child remained in residential treatment.
No child should ever be asked to sign a contract that waives their rights or shields billion-dollar corporations from responsibility. No child should be bound by agreements that allow companies to harvest their personal data or exempt themselves from harm without a parent’s knowledge or consent. And no developer should be able to misrepresent the safety of an app used by a child.
Thanks to a new law set to go into effect early next year, kids in three states will finally be protected from these harms. The App Store Accountability Act, adopted in Texas, Utah and Louisiana, requires app stores to verify users’ ages, obtain parental consent before minors download or purchase apps, and ensure that any age ratings are accurate and clearly displayed.
It seems app stores are set to comply. Earlier this month, Apple and Google notified millions of developers about new tools and requirements to help them meet the law's standards. The requirements mark the first time Apple and Google have formally acknowledged their legal responsibility for the safety and integrity of the marketplaces they control. The change represents a landmark shift for an industry that has long operated without meaningful accountability.
The changes reflect years of work by lawmakers, parents and advocates determined to protect children online. They also confirm what experts have said for decades: The greatest barrier to child safety on digital platforms has never been technology or cost. It has been the absence of legal liability. For years, technology companies have built systems to collect and monetize personal data while claiming they lacked the ability to verify a user’s age or confirm parental consent.
That excuse belongs to another era. In 2004, when the Supreme Court struck down the Child Online Protection Act in Ashcroft v. ACLU, it accepted the industry's argument that age verification was too burdensome and unreliable. The ruling effectively handed corporations a free pass, shifting the burden of protection from the companies to the parents, who were left to install filtering software that rarely worked.
But that was before smartphones, app stores and behavioral analytics transformed the digital economy. Today, verifying and identifying users is routine. By 2013, Facebook could infer a user’s age from behavior alone, and Apple and Google had long verified identities through payment systems and Family Sharing tools.
In 2014, the Federal Trade Commission issued consent decrees requiring Apple and Google to obtain express, informed consent before charging minors for in-app purchases. That order, now more than a decade old, confirms that the technology to verify age and secure parental permission has long been available. The FTC rightly recognized that without such safeguards it is impossible to draw a meaningful line between legitimate commerce and exploitation.
The App Store Accountability Act follows the same principle, requiring platforms to ensure that minors cannot make purchases without verified parental consent. The law does not restrict speech or dictate content. It simply requires app stores, acting as direct retailers, to provide accurate product information and obtain verified parental consent when contracting with minors. These are ordinary consumer protection principles that already apply in every other marketplace.
The fact that Apple and Google are preparing these systems proves that compliance is not only possible but also long overdue. As the Digital Childhood Institute’s recent FTC complaints against Apple and Google reveal, both app stores have long peddled deceptive age ratings, false safety claims and broken parental controls. Many apps rated safe for children were found to collect exact location data, contacts, photos and browsing activity without parental consent. Even more alarming, many apps require children to agree to legal terms that indemnify Apple, Google and app developers against any harm resulting from their use.
Protecting children online doesn’t require new technology. It only requires the will to use existing tools responsibly. For years, major platforms claimed that safety was impossible even as they built systems capable of predicting and monetizing every user’s behavior.
Real protection for children begins when accountability is written into law. With the App Store Accountability Act, that day has finally come.
