Nearly a month after the state of Utah filed a lawsuit against social media company Snap Inc., statistics regarding the app and children who use it were publicly released.
The data, used in the lawsuit and unredacted on Tuesday, reveals a “disturbing extent to which Snapchat’s alleged practices directly harm Utah’s children,” according to a press release by the Utah Department of Commerce’s Division of Consumer Protection, which brought the lawsuit.
Last month, Utah Attorney General Derek Brown, whose office is representing Utah in the case, said this is the most consequential lawsuit out of the state’s four ongoing social media cases, because it affects children the most.
“We will do everything we can using the legal system to incentivize and encourage companies to take steps to protect kids,” Brown previously said. “And parents need to be very mindful of what’s taking place on social media, because a lot of the drug dealing, the extortion, the sexting, and a lot of the really problematic things that are taking place right now with our kids is focused not just on social media, but on Snapchat.”
His office declined to comment on the unredacted data because it pertains to active litigation.
Here is the unredacted data the press release highlights as damaging to Utah kids:
- Teenagers in Utah have spent nearly 8 billion minutes on the photo/video app since 2020. More than 500,000 Utah users use Snapchat between 10 PM and 5 AM.
- Snap’s senior engineering managers internally labeled the app’s AI tool “reckless” due to a lack of proper testing. Employees even cautioned that it “hallucinates answers” and “can be tricked into saying just about anything.”
- The AI tool uses a person’s location even when “ghost mode” is on, a fact not publicly disclosed. Snap also sends the private conversations users have with the AI tool to Microsoft Advertising and OpenAI.
- The company “internally admitted being ‘overrun’ with sexual extortion and that it ‘takes under a minute to use Snapchat to be in a position to purchase illegal and harmful substances.’”
- Although billed as a “critical safety tool,” Snap’s in-app reporting feature had notable shortcomings: more than 96% of reports on accounts were never reviewed by the Trust and Safety Team. One Snapchat account was reported 75 times for mentioning “nudes, minors, and extortion,” yet stayed active for 10 months.
“If I’m the head of this company, and I understand how much my product is harming kids and how unsafe it is, why would I keep doing this?” Margaret Busse, the executive director of Utah’s Department of Commerce, previously told the Deseret News.
“This is a choice companies make. It is not inevitable,” she said. “They could design a product with a very different business model, with very different features, that doesn’t have to be exploitative of our kids.”