Utah Sen. Mike Lee has been preparing for recent Senate hearings with U.S. social media platforms by testing their claims of safeguards for kids: his staff set up spoof accounts for underage users and then watched what kind of content rolled in.
In October, Snapchat got snared in the exercise, and on Wednesday, Instagram met the same fate during a hearing held by the U.S. Senate Subcommittee on Consumer Protection, Product Safety, and Data Security.
Instagram head Adam Mosseri, the only witness at the hearing, was called out when a fake account opened by Lee staffers for a fictitious 13-year-old girl also led to content Lee called “unsuitable for children.”
Lee said the account initially only attracted content suggestions that seemed appropriate for a 13-year-old until the fictional user responded to a prompt from Instagram to follow the account of a “very famous female celebrity.” Then, Lee said, things changed dramatically.
“Why did following Instagram’s top recommended account for a 13-year-old girl cause our Explore page to go from showing relatively innocuous things like hairstyling videos to content promoting body dysmorphia, the sexualization of women and content otherwise unsuitable for a 13-year-old girl?” Lee asked Mosseri. “What happened?”
What Instagram is — and is not — doing to protect users under the age of 18 on its platform was the focus of the 2 1/2-hour hearing, another in a string of inquiries by U.S. lawmakers instigated by the release of a trove of internal documents earlier this year by a former employee of Instagram’s owner, Facebook.
On Tuesday, Instagram introduced a previously announced feature that urges teenagers to take breaks from the platform. The company also announced other tools, including parental controls, due to come out early next year that it says are aimed at protecting young users from harmful content.
But subcommittee chairman Sen. Richard Blumenthal, D-Conn., told Mosseri the parental oversight tools “could have been announced years ago.” He said the newly announced measures fall short, and many of them are still being tested.
A pause that Instagram imposed in September on its work on a kids’ version of the platform “looks more like a public relations tactic brought on by our hearings,” Blumenthal said.
“I believe that the time for self-policing and self-regulation is over,” said Blumenthal. “Self-policing depends on trust. Trust is over.”
Mosseri testified as Facebook, whose parent now is named Meta Platforms, has been roiled by public and political outrage over the disclosures by former Facebook employee Frances Haugen. She has made the case before lawmakers in the U.S., Britain and Europe that Facebook’s systems amplify online hate and extremism and that the company elevates profits over the safety of users.
Haugen, a data scientist who had worked in Facebook’s civic integrity unit, buttressed her assertions with a trove of internal company documents she secretly copied and provided to federal securities regulators and Congress.
The Senate panel has examined Facebook’s use of information from its own researchers that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative impacts. For some Instagram-devoted teens, peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research detailed in the Facebook documents showed.
The revelations in a report by The Wall Street Journal, based on the documents leaked by Haugen, set off a wave of recriminations from lawmakers, critics of Big Tech, child-development experts and parents.
“As head of Instagram, I am especially focused on the safety of the youngest people who use our services,” Mosseri testified. “This work includes keeping underage users off our platform, designing age-appropriate experiences for people ages 13 to 18, and building parental controls. Instagram is built for people 13 and older. If a child is under the age of 13, they are not permitted on Instagram.”
The scathing September report by The Wall Street Journal reviewed Facebook internal research findings that reflected serious harms, particularly for teen girls, stemming from time spent on Instagram.
“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” the researchers said in a March 2020 slide presentation posted to Facebook’s internal message board, reviewed by The Wall Street Journal. “Comparisons on Instagram can change how young women view and describe themselves.”
Journal reporting found that Facebook had been conducting studies for the past three years into how its photo-sharing app affects its millions of young users. Repeatedly, the company’s researchers found that Instagram is harmful for a sizable percentage of them, most notably teenage girls.
“We make body image issues worse for 1 in 3 teen girls,” said one slide from 2019, summarizing research about teen girls who experience the issues.
“Teens blame Instagram for increases in the rate of anxiety and depression,” said another slide. “This reaction was unprompted and consistent across all groups.”
Among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram, one presentation showed.
Utah’s senior senator likened Instagram’s failure to protect children to that of another big U.S. industry, one that ran into massive legal liabilities when it was widely exposed that its product was dangerous, and sometimes lethal, to its users.
“You’re the new tobacco, whether you like it or not,” Lee told Mosseri. “And you’ve got to stop selling the ‘tobacco’ to kids. Don’t let them have it. Don’t give it to them.”
Mosseri outlined the suite of measures he said Instagram has taken to protect young people on the platform. They include keeping kids under 13 off it, restricting direct messaging between kids and adults, and prohibiting posts that encourage suicide and self-harm.
But, as researchers both internal and external to Meta have documented, the reality is different. Kids under 13 often sign up for Instagram with or without their parents’ knowledge by lying about their age. And posts about suicide and self-harm still reach children and teens, sometimes with disastrous effects.
Senators noted there are multiple federal legislative proposals in progress, drawing bipartisan support, that aim to bolster protections for underage users of social media and other digital platforms. The proposals include making key internet business protections offered under Section 230 of the Communications Decency Act contingent on robust protection policies for kids.
Contributing: Associated Press