Executives from social media titans TikTok, Snapchat and YouTube were in the hot seat Tuesday for a U.S. Senate hearing focused on protecting children from exposure to harmful online content.

Utah Sen. Mike Lee did a little sleuthing of his own ahead of the hearing, which marked the first appearances before Congress for China-based short video platform TikTok and for Snap Inc., owner of the U.S. messaging app Snapchat. Lee had his staff launch a Snapchat account for a fictitious 15-year-old and then monitored what kind of content was pushed to the fake teen.

During his questioning of Snap vice president of global public policy Jennifer Stout, Lee said what his staffers discovered was appalling for a platform that purports to moderate content appropriate for users aged 13 and older.

“When they opened the Discover page on Snapchat ... they were immediately bombarded with content that I can most politely describe as wildly inappropriate for a child,” Lee said. “Including recommendations for, among other things, an invite to play an online sexualized video game that’s marketed itself to people who are 18 and up, tips on ‘why you shouldn’t go to bars alone’ and ... articles about porn stars. Let me remind you that this inappropriate content has by default been recommended for a 15-year-old child ... sent to them by an app just using the default setting.

“I respectfully but very strongly beg to differ on your characterization that the content is suitable for children 13 and up.”


Citing the harm that can come to vulnerable young people from the sites — ranging from eating disorders to exposure to sexually explicit content and material promoting addictive drugs — lawmakers on the Senate Commerce Subcommittee on Consumer Protection, Product Safety and Data Security also sought the executives’ support for legislation bolstering protection of children on social media. But they received little firm commitment.

“The problem is clear: Big Tech preys on children and teens to make more money,” Sen. Edward Markey, D-Mass., said at the hearing.

The subcommittee recently took testimony from a former Facebook data scientist who laid out internal research showing that the company’s Instagram photo-sharing service appears to seriously harm some teens. The subcommittee is widening its focus to examine other tech platforms, with user bases in the millions or billions, that also compete for young people’s attention and loyalty.

“We’re hearing the same stories of harm” caused by YouTube, TikTok and Snapchat, said Sen. Richard Blumenthal, D-Conn., the panel’s chairman.

“This is for Big Tech, a big tobacco moment ... It is a moment of reckoning,” he said. “There will be accountability. This time is different.”

To that end, Markey asked the three executives — Michael Beckerman, a TikTok vice president and head of public policy for the Americas; Leslie Miller, vice president for government affairs and public policy of YouTube’s owner Google; and Stout — if they would support his bipartisan legislation that would give new privacy rights to children, and ban targeted ads and video autoplay for kids.

In a lengthy exchange as Markey tried to draw out a commitment of support, the executives avoided providing a direct endorsement, insisting that their platforms already are complying with the proposed restrictions. They said they’re seeking a dialogue with lawmakers as the legislation is crafted.

That wasn’t good enough for Markey and Blumenthal, who perceived a classic Washington lobbying game in a moment of crisis for social media and the tech industry.

“This is the talk that we’ve seen again and again and again and again,” Blumenthal told them. Applauding legislative goals in a general way is “meaningless” unless backed up by specific support, he said.

“Sex and drugs are violations of our community standards; they have no place on TikTok,” Beckerman said. TikTok has tools in place, such as screen-time management, to help young people and parents moderate how long children spend on the app and what they see, he said.

The company says it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. The video platform, wildly popular with teens and younger children, is owned by the Chinese company ByteDance. In only five years since launching, it has gained an estimated 1 billion monthly users.

Early this year, after federal regulators ordered TikTok to disclose how its practices affect children and teenagers, the platform tightened its privacy practices for users under 18.

Pressed by Sen. Amy Klobuchar, D-Minn., about a 19-year-old said to have died from counterfeit pain medication he bought through Snapchat, Stout said, “We’re absolutely determined to remove all drug dealers from Snapchat.” Stout said the platform has deployed detection measures against dealers but acknowledged that they are often evaded.

Stout made the case that Snapchat’s platform differs from the others in relying on humans, not artificial intelligence, for moderating content.

Snapchat allows people to send photos, videos and messages that are meant to quickly disappear, an enticement to its young users seeking to avoid snooping parents and teachers. Hence its “Ghostface Chillah” faceless (and word-less) white logo.

Snapchat, only 10 years old, says an eye-popping 90% of 13- to 24-year-olds in the U.S. use the service. It reported 306 million daily users in the July-to-September quarter.

Miller said YouTube has worked to provide children and families with protections and parental controls, such as time limits, to restrict viewing to age-appropriate content.

The three platforms are woven into the fabric of young people’s lives, often influencing their dress, dance moves and diet, potentially to the point of obsession. Peer pressure to get on the apps is strong. Social media can offer entertainment and education, but platforms have been misused to harm children and promote bullying, vandalism in schools, eating disorders and manipulative marketing, lawmakers say.

The panel wants to learn how algorithms and product designs can magnify harm to children, foster addiction and intrude on privacy. Blumenthal in particular asked the executives whether independent research had been conducted on the platforms’ impact on young people, and said lawmakers expected to receive information from the companies on such research soon.

TikTok, in its first time testifying before Congress, received especially fierce criticism during the hearing, particularly from conservative Republican lawmakers who highlighted its Chinese ownership. The company says it stores all TikTok U.S. data in the United States, with a backup facility in Singapore.

“TikTok actually collects less data than many of our peers,” Beckerman said.

Sen. Ted Cruz, R-Texas, told Beckerman that he dodged questions more than any witness he’s ever seen in Congress.

TikTok’s privacy policy states, “We may share all of the information we collect with a parent, subsidiary or other affiliate of our corporate group.” Senators drilled down on whether “other affiliate” includes ByteDance and what that means for Chinese access to data.