Tech company Apple is preparing to roll out new software that will search Americans’ iPhones for photos of child sexual abuse, The Associated Press reported. Apple announced the software as part of new updates aimed at protecting children.
The announcement was praised by child safety organizations but has raised privacy concerns about a tech company having the power to scan users’ phones or computers.
The technology, called “NeuralHash,” will compare an Apple user’s images with a known database of Child Sexual Abuse Material (CSAM) — provided by the National Center for Missing and Exploited Children and other child safety organizations — before the user’s images are uploaded to iCloud, Apple said in a technical summary of the software.
- “If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified,” the AP reported.
- The updated software for iPhones, Macs and Apple Watches will roll out later this year, according to the AP.
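The matching step described above can be illustrated with a simplified sketch. This is a toy illustration, not Apple’s actual NeuralHash system: the hash function and database below are hypothetical placeholders, and Apple’s real system uses a perceptual neural hash with cryptographic protections rather than an exact-match lookup.

```python
# Toy illustration of matching images against a database of known hashes
# before upload. NOT Apple's NeuralHash: SHA-256 is a stand-in here, and
# only matches byte-identical files, whereas a perceptual hash would also
# match visually similar images.
import hashlib

# Hypothetical database of known-image hashes
# (a stand-in for the NCMEC-provided CSAM hash list).
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Placeholder hash of an image's bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def check_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches the known database, i.e. it
    would be set aside for human review before iCloud upload."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(check_before_upload(b"known-flagged-image-bytes"))  # True: flagged
print(check_before_upload(b"ordinary-vacation-photo"))    # False: uploads normally
```

The key design point the summary describes is that the comparison happens on the device before upload, against hashes rather than the images themselves.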
John Clark, NCMEC president and chief executive, applauded Apple’s software update and the expanded child protections on its devices.
- “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material,” Clark said in a tweet Thursday.
But what about privacy?
Another feature Apple announced will blur sexually explicit photos sent to or from underage iMessage users and warn them before they view or send one, The Wall Street Journal reported. Apple said the updated software can warn children that their parents will receive a message if they choose to view the flagged content.
Cryptography specialist and Johns Hopkins University computer science professor Matthew Green said governments could be interested in using the technology to conduct surveillance of iPhones, something that Apple has generally resisted, The Wall Street Journal reported.
- “Now Apple has demonstrated that they can build a surveillance system, for very specific purposes, that works with iMessage,” said Green. “I wonder how long they’ll be able to hold out from the Chinese government?”
Edward Snowden, a former CIA contractor who leaked evidence of a massive government surveillance program, said Apple had turned its devices into “iNarcs” — a mashup of iPhone and undercover narcotics agents.
- “No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow,” Snowden said on Twitter.
They turned a trillion dollars of devices into iNarcs—*without asking.* https://t.co/wIMWijIjJk
Apple says the software will protect kids and privacy
In the Apple summary of NeuralHash, the iPhone company said the “process is secure, and is expressly designed to preserve user privacy.” The tech company added that users will be unable to access the CSAM database and that there is a very low risk of mistakenly flagging an account.
- “This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM,” Apple said in a statement. “And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.”
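Apple’s claim that it “only learns about users’ photos if they have a collection of known CSAM” suggests a threshold rule: a single match reveals nothing, and an account is referred for human review only after several matches accumulate. A minimal, hypothetical sketch of that behavior follows; the threshold value and function name are invented for illustration, and Apple’s actual mechanism uses cryptographic threshold techniques the summary does not reduce to a simple counter.

```python
# Hypothetical sketch of threshold-based flagging: an account is referred
# for human review only once its number of database matches crosses a
# threshold, reducing the chance that a single false positive exposes a user.
MATCH_THRESHOLD = 5  # invented value for illustration only

def should_review(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """True only when the number of matches meets or exceeds the threshold."""
    return match_count >= threshold

print(should_review(1))  # False: one match alone triggers nothing
print(should_review(5))  # True: a collection of matches triggers review
```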