When Frances Haugen, a former product manager at Meta/Facebook, testified before a Senate committee about the dangers of algorithmically enhanced engagement, many people paid attention, but lawmakers did not act.
Now that, for the first time, social media has been formally judged a partial cause of a suicide, they may finally be compelled to do so.
Molly Russell was only 14 when she hanged herself in the U.K. in November 2017. Almost five years later, the official coroner’s inquest found that “Molly Rose Russell died from an act of self-harm whilst suffering from depression and the negative effects of online content.”
To understand why social media companies may be at least partially culpable in Molly’s death, it’s helpful to revisit Haugen’s testimony from one year ago.
Haugen explained then that the business model of social media companies is based on engagement, measured by the amount of time a person spends consuming and reacting to content on their sites.
The greater the engagement, the more information about you the site can extract and sell. These companies therefore look for ways to keep you on their sites longer, and to do that they must keep your interest high.
And so they have developed algorithms that use machine learning to figure out what keeps you excited and engaged for as long as possible. The key to achieving this is to elicit an emotional reaction, not a cerebral one. The strongest emotions — outrage, titillation, envy, despair — are the best bets.
No wonder, then, as Haugen notes, “The result has been a system that amplifies division, extremism and polarization — and undermining societies around the world. In some cases, this dangerous online talk has led to actual violence that harms and even kills people. In other cases, their profit-optimizing machine is generating self-harm and self-hate — especially for vulnerable groups, like teenage girls.”
The details of the Russell case are tragic in the extreme. Russell fell into a depressive state when she reached her teen years, and while searching online, she began to engage with posts about depression. As soon as Meta’s algorithms noticed this, the company began to feed her more and more extreme content on depression, self-harm and suicide — material she had not requested.
In fact, Haugen, commenting on the case, stated that the algorithms likely showed Russell harmful content before she had even searched for it. If Molly had typed even something innocuous about being blue or feeling sad, the algorithms would have pushed material on her that was much further down the rabbit hole. The algorithms were designed to induce binge-watching, and they were successful: in the last six months of her life, Russell was engaging with an average of 130 posts a day.
The child psychiatrist giving testimony at the inquest, Dr. Navin Venugopal, watched the material Russell was fed. He stated, “I will talk about the effect the material had on my state of mind. I had to see it over a short period of time, and it was very disturbing, distressing. There were periods where I was not able to sleep well for a few weeks so bearing in mind that the child saw this over a period of months I can only say that she was (affected) — especially bearing in mind that she was a depressed 14-year-old. It would certainly affect her and made her feel more hopeless.”
The Russell family’s lawyer had to seek therapy after reviewing some of the content Russell had seen, commenting, “It keeps sucking you deeper, I could feel it happening to myself and I’m a resilient adult.”
Astoundingly, Meta’s witness at the inquest, a company executive, declared that they found the videos “safe.” Asked by the family’s lawyer whether it was obvious that graphic suicide imagery was patently not safe for children, the same executive stated, “I don’t know ... these are complicated issues.” To add insult to injury, it took years of fighting for the Russell family to even receive information from Meta about what their daughter had been viewing. The company stalled until August of this year, less than a month before the inquest started.
Now that the inquest has found several specific social media sites partially responsible for Russell’s death, the next step is to hold them liable in a criminal court. Peter Wanless, chief executive of the National Society for the Prevention of Cruelty to Children, said, “The ruling should send shockwaves through Silicon Valley — tech companies must expect to be held to account when they put the safety of children second to commercial decisions. The magnitude of this moment for children everywhere cannot be understated.”
The U.K. is proposing a new Online Safety Bill, which would for the first time compel platforms to protect their users, particularly children, from online harm by requiring them to take down illegal and other harmful content. Companies would have to spell out how they plan to prevent harmful material from reaching their users. The bill would also force social media companies to reveal their algorithms to regulators. Companies that defy these rules would face fines or could even be blocked from the U.K. entirely.
We in the U.S. are not quite as far down the road as the U.K., but that may change. The proposed Kids Online Safety Act would do many of the same things as the U.K. bill, but unlike the U.K. approach, it currently includes no enforcement body to ensure compliance. Colorado Sen. Michael Bennet has recently proposed legislation that would create a new Federal Digital Platform Commission with the power to police social media sites.
But there may not be much action in the U.S. until families here achieve the kind of success in official judicial proceedings that Russell’s family has had in holding social media sites accountable. That may well happen. This past July, two Kentucky mothers filed separate lawsuits against Meta, claiming the company’s algorithms pushed their daughters into eating disorders that led both of them to attempt to take their own lives. In fact, more than a dozen lawsuits alleging this type of harm are pending against Meta, bolstered by internal documents leaked to the press showing that the company knew and admitted, “We make body image issues worse for one in three teen girls. Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.”
While the wheels of justice may grind slowly, they are grinding away. The reckoning for social media companies is coming. Self-regulation is clearly not enough. As Andy Burrows, head of child safety online policy at the NSPCC, has said, “This is social media’s big tobacco moment. For the first time globally, it has been ruled content a child was allowed and even encouraged to see by tech companies contributed to their death.”
Molly Russell. Remember her name.
Valerie M. Hudson is a university distinguished professor at the Bush School of Government and Public Service at Texas A&M University and a Deseret News contributor. Her views are her own.