After years of applied pressure and even begging from parents, advocates and lawmakers, Apple has suddenly decided to fix failures in its child safety features. Why now?

Simple. Utah and other states are moving to enact legislation that requires age verification and parental consent for all app downloads and purchases. What we call the App Store Accountability Act appears poised to become law in the Beehive State, with other states likely to follow, and Apple is paying close attention. The company has released a set of reforms it headlined “Helping Protect Kids Online,” and its strategy of “shock and awe” seems to be working.

As word of these updates tears through state capitals across the country, lawmakers are rightly wondering: Do these updates answer the issues that our legislation identifies and seeks to fix?

After getting the cold shoulder from Apple for years, we admit that we are pleased by some of these updates. But given what’s at stake, they are not enough. Here’s why.

Apple’s eight-page announcement outlines a number of updates to be rolled out by year’s end. Promised features include easier setup of child accounts; a new “age range” application programming interface (API) that allows app stores and app developers to share age-category data to better ensure age-appropriate experiences; more granular app age ratings; better app descriptions; and the removal of apps that exceed a child’s age range from the app store.

The “age range API” seems particularly well-done. According to TechCrunch:

“Instead of asking kids to input their birthdays, as many social apps do today, developers will have the option to use a new Declared Age Range API that allows them to access the age range information the parent input during the child account setup. (Parents can also correct this information at any time if it was originally entered incorrectly).”

These updates are needed. But why in the world has it taken so long?

For years, shareholders and child-safety advocates have been asking the company to better protect kids online. In 2018, almost 11 years after the iPhone’s release, Apple shareholders wrote a letter to the board of directors demanding that the company give parents more resources and tools to protect children. A 2022 Canadian Centre for Child Protection report detailed Apple’s failure to enforce app age ratings for younger users and labeled its parental controls as “inadequate” and “unusable.” A Screen Time parental-control bug that causes protective settings to disengage on child accounts has plagued iPhones and iPads, unsolved, for more than two years. And a scathing Wall Street Journal investigation published in December revealed that Apple misrepresented up to a quarter of its apps as “safe for kids” when many were rife with sexual exploitation and bullying.

Between December and now, Apple’s technological capabilities did not change. To put it positively: Apple already possessed the technological and financial capacity to institute these changes. Modern APIs have been around for years. And Apple has every financial resource it needs to develop and effectively implement these safeguards, raking in $26 billion in revenue from its app store alone during FY 2023. These are good changes, but from a moral, financial and technical standpoint, there is no reason why it should have taken this long.

But are Apple’s new features enough? No, deep issues remain unaddressed.

A few years ago, the young son of one of the authors of this column was served ads for sexual role play apps — including a graphic strip show — and ads for apps that focus on gambling and marijuana cultivation, all while playing a cartoon game that the app store rated safe for children. The mother had only stepped away for a few minutes to fold laundry, assuming that the age rating accurately represented what her son would experience while using the app.

Such anecdotes are not the exception; they are the rule. There is a systemic failure that leaves parents and children who access the internet through the app store totally vulnerable, because Apple allows developers to operate on an honor system, without any meaningful enforcement. Apple’s recent safety update doesn’t change this; its app store will still rely on the honor system, allowing developers to self-report content with little oversight and few consequences for misrepresenting age ratings or content warnings.

As mentioned, Apple’s announcement comes just days before Utah is expected to become the first state to pass the App Store Accountability Act. The bill would require Apple, and other app store providers like Google, to perform age verification and obtain parental consent before minors can download apps, purchase apps or make in-app purchases. App developers, in turn, would be required to enforce any developer-created age restrictions using the age-category and parental-consent status that the app store collects for each account holder. Additionally, the legislation would require all minors to link their accounts to a parent’s account before using the app store in the first place.

Apple’s proposed changes will not solve the core issue: minors entering binding terms of service agreements without parental consent. Currently, the app store routinely allows known minors to download apps, accept complex terms of service and make in-app purchases without any parental consent. This loophole exposes children to privacy risks, financial harm and dangerous digital environments.


Only in app stores do we allow minors to enter into terms of service agreements with trillion-dollar companies that determine how their personal data can be used, often giving such companies full access to extremely sensitive information like photographs and videos of children, their exact location and their contact lists. Our legislative model ends that practice.


Apple’s new updates, by contrast, will not stop any of this. Apple will still treat teens as digital adults, allowing minors to agree to complex terms of service contracts without parental consent, despite the fact that in the real world, one has to be 18 to enter into a binding contract.

Furthermore, Apple will only enforce the proposed app store protections for parents who figure out how to enable content controls, leaving parents without any meaningful backstop when their kids try to circumvent the controls. (And they will try.)

The stakes are high. If Apple can convince lawmakers that its updates are adequate to the task, then it will continue to prioritize its profit over protecting kids online without real consequences. We welcome better labels and controls, but parents and their kids need much more than that. They need a bill that provides app store accountability. And they need it now.

Melissa McKay is the chairman of the Digital Childhood Alliance. Chris McKenna is the founder and CEO of Protect Young Eyes. Michael Toscano is the executive director of the Institute for Family Studies and director of the Family First Technology Initiative. Jared Hayden is a policy analyst with the Family First Technology Initiative at the Institute for Family Studies.
