Facebook whistleblower Frances Haugen was the sole witness at a Tuesday Senate hearing where Utah Sen. Mike Lee presented evidence he said showed Facebook is simply failing to catch harmful ad content targeting children.
Lee, R-Utah, used visual aids to show three ads that a watchdog group said were approved by Facebook for an audience of potentially over 9 million 13- to 17-year-olds. One ad, he said, celebrated recreational drug use, another promoted unhealthy eating habits and a third pushed a message that would have asked underage users if they wanted “to make a love connection.”
While the ads were never run, Lee hoped Haugen could shed some light on how the potentially harmful content could have made it past Facebook’s screening process.
“One could argue (this) proves Facebook is allowing and perhaps facilitating the targeting of harmful, adult-themed ads to our nation’s children,” Lee said. “Can you tell me how these ads, with a target audience of 13- to 17-year-old children ... could possibly be approved by Facebook?”
Haugen, a former Facebook employee who secretly copied tens of thousands of internal documents before leaving her job as a member of Facebook’s civic integrity unit and filing federal whistleblower complaints, said there is a chance an actual person never saw the images before approval was granted.
“It is very possible none of these ads were seen by a person,” Haugen said. “And the reality is that we’ve seen in repeated documents from my disclosures that ... Facebook’s artificial intelligence system only catches a very tiny minority of offending content.
“In the case of children, drug paraphernalia ads like that, if they rely on computers instead of humans ... only catch 10 to 20%.”
Lee has been an outspoken advocate for efforts to rein in the expansive market power of U.S. “Big Tech” platforms and, at a hearing last month, had the opportunity to question Facebook vice president of privacy and public policy Steve Satterfield.
Lee cited a series of damning Wall Street Journal reports published last month, based on internal Facebook documents provided by Haugen, suggesting the company is aware of flaws in its platforms that cause direct and measurable harm to its users.
Lee said the reporting revealed “shocking, absolutely stunning lapses in Facebook’s ability to protect Facebook consumers, its users, from being harmed by using its platforms” and that the failure to act on these issues is simply an outgrowth of its status as the world’s biggest social media platform.
“This too looks like the behavior of a monopolist that’s so sure its customers have nowhere to go that it displays a reckless disregard for quality assurance for its own brand image,” Lee said. “And even just being honest with its users about the obvious safety risks it’s subjecting its users to, particularly its teenage users.”
Speaking confidently at a charged hearing in front of the Senate Commerce Subcommittee on Consumer Protection, Product Safety, and Data Security, Haugen accused the company of being aware of apparent harm to some teens from Instagram and being dishonest in its public fight against hate and misinformation.
“Facebook’s products harm children, stoke division and weaken our democracy,” Haugen said. “The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people.”
“Congressional action is needed,” she said. “They won’t solve this crisis without your help.”
Haugen said that while the company openly acknowledged that integrity controls were critical for internal systems that stoke the engagement of users, it failed to fully deploy some of those tools.
In dialogue with receptive senators of both parties, Haugen, who focused on algorithm products in her work at Facebook, explained the importance to the company of algorithms that govern what shows up on users’ news feeds. She said a 2018 change to the content flow contributed to more divisiveness and ill will in a network ostensibly created to bring people closer together.
Despite the enmity that the new algorithms were feeding, she said Facebook found that they helped keep people coming back — a pattern that helped the social media giant sell more of the digital ads that generate the vast majority of its revenue.
“It has profited off spreading misinformation and disinformation and sowing hate,” said Sen. Richard Blumenthal, D-Conn., the panel’s chairman. “Facebook’s answers to Facebook’s destructive impact always seem to be more Facebook, we need more Facebook — which means more pain, and more money for Facebook.”
Haugen said she believed Facebook didn’t set out to build a destructive platform. “I have a huge amount of empathy for Facebook,” she said. “These are really hard questions, and I think they feel a little trapped and isolated.”
But “in the end, the buck stops with Mark,” Haugen said, referring to Mark Zuckerberg, who controls more than 50% of Facebook’s voting shares. “There is no one currently holding Mark accountable but himself.”
Like fellow tech giants Google, Amazon and Apple, Facebook has enjoyed minimal regulation. A number of bipartisan legislative proposals for the tech industry address data privacy, protection of young people and anti-competitive conduct. But getting new laws through a divided Congress is a heavy slog. The Federal Trade Commission has adopted a stricter stance recently toward Facebook and other companies.
The subcommittee is examining Facebook’s use of information its own researchers compiled about Instagram. Those findings could indicate potential harm for some of its young users, especially girls, although Facebook publicly downplayed possible negative impacts. For some of the teens devoted to Facebook’s popular photo-sharing platform, the peer pressure generated by the visually focused Instagram led to mental health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research leaked by Haugen showed.
In one internal study, 13.5% of teen girls said Instagram makes thoughts of suicide worse, and 17% of teen girls said it makes eating disorders worse.
Haugen said that Facebook prematurely turned off safeguards designed to thwart misinformation and incitement to violence after Joe Biden defeated Donald Trump last year, alleging that doing so contributed to the deadly Jan. 6 assault on the U.S. Capitol.
After the November election, Facebook dissolved the civic integrity unit where Haugen had been working. That was the moment, she said, when she realized that “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.”
Haugen says she told Facebook executives when they recruited her that she wanted to work in an area of the company that fights misinformation, because she had lost a friend to online conspiracy theories.
Facebook maintains that Haugen’s allegations are misleading and insists there is no evidence to support the premise that it is the primary cause of social polarization.
“Even with the most sophisticated technology, which I believe we deploy, even with the tens of thousands of people that we employ to try and maintain safety and integrity on our platform, we’re never going to be absolutely on top of this 100% of the time,” Nick Clegg, Facebook’s vice president of policy and public affairs, said Sunday on CNN’s “Reliable Sources.”
That’s because of the “instantaneous and spontaneous form of communication” on Facebook, Clegg said, adding, “I think we do more than any reasonable person can expect to.”