Before the internet’s rise in the 1990s and the popularization of smartphones in the 2010s, pornography was relegated to VCRs, magazines and brick-and-mortar buildings. Now pornography websites draw billions of visits, and explicit material is available in people’s pockets at virtually any time.

As access has changed, so have many of the country’s views on pornography.

Gallup polling data from 2023 shows that 39% of respondents said pornography is morally acceptable. That’s up from 30% in 2011, when Gallup first started asking the question. Other survey data shows 37% of baby boomers (people born from 1946 to 1964) think pornography is “very bad” for society. By comparison, 14% of young adults hold that same view, and 51% of respondents said they disagreed with the idea that pornography was morally wrong.

Viewership has also increased. Using data from the General Social Survey, a group of economists, a policy analyst and a sociologist evaluated pornography consumption among 18- to 26-year-olds from 1973 to 2012. Between 1973 and 1980, 45% of young men said they had viewed pornography; between 1999 and 2012, that figure rose to 61%. The researchers pointed toward the spread of the internet as a potential driver of the increase.

This hasn’t been without consequences. Fifteen other states have followed Utah in passing resolutions declaring pornography a public health crisis. Psychology and health communication researchers have sounded the alarm, pointing toward negative impacts of viewership like increased physical aggression and a distorted perception of intimacy.

It’s an issue Sen. Mike Lee has taken on in Washington, D.C., as he tries to get legislation passed aimed at protecting children from seeing pornography and giving victim-survivors paths to legal recourse.

“We’ve got far more young people today, having access to essentially unlimited supply of pornographic material,” Lee said in an interview with the Deseret News. A 2022 report found that 73% of teens between the ages of 13 and 17 have consumed pornography, defined as “any videos or photos viewed on websites, social media, or anywhere else on the internet that show nudity and sexual acts intended to entertain and sexually arouse the viewer.”

Doctors wrote in a 2023 report that “Increased access to the internet and other online platforms has resulted in a rapid increase in the number of adolescents who encounter and consume pornographic content.”

“Adolescent pornography use has continuously increased over time and the age of first exposure to sexually explicit materials has also been getting younger. The estimates of prevalence rates have varied, but nationally representative surveys of adolescents in the USA have found that 68.4% reported exposure to online pornography,” they wrote in the report.

“Over the last roughly 30 years, Congress has tried to do something about this to protect young people from the dangers associated with having free access over the internet to porn,” Lee said. “But a lot of those laws that have been passed by Congress have ultimately been invalidated by the Supreme Court under one theory or another.”

The Communications Decency Act (title V of the Telecommunications Act of 1996) is an example. Congress passed this law in 1996 and it criminalized the transmission of obscene material to children. The Supreme Court struck down the law in 1997 in a case known as Reno v. ACLU. The majority opinion authored by Justice John Paul Stevens determined the law was written too broadly to pass constitutional muster while also recognizing “the governmental interest in protecting children from harmful materials.”

Lee said the SCREEN Act, which he reintroduced this winter, found “the sweet spot” and could pass constitutional muster. “I think this one is going to survive if it becomes law, which it should,” he said, explaining the law focuses on “age verification with regard to commercial purveyors of pornography.”


The bill requires commercial websites distributing pornography to use technology to verify the ages of their users. Children would be blocked from seeing content the bill defines as harmful, including images and videos that, “taken as a whole and with respect to minors, appeals to the prurient interest in nudity, sex, or excretion.” Lee compared it to the sale of cigarettes. “Just as you can’t sell cigarettes to a minor and give them to a minor, you also shouldn’t directly market cigarettes to a minor,” he said.

When asked about the constitutionality of age verification measures, Lee pointed toward the commercial element involved.

“Remember that under the doctrines applicable in this area, under what’s known as the commercial speech doctrine, it gives more flexibility for government regulation, particularly for something like this, where there’s a legitimate, widely accepted public policy at issue,” Lee said. “They need to protect children who are uniquely vulnerable to the harms associated with pornography addiction.”

Bipartisan support for legislation like the SCREEN Act is “building gradually,” according to Lee. “This is one of those things that is so reasonable, it’s really hard for people to come out openly and oppose it.”

But at the same time, it’s not the sort of issue “that gets people excited,” Lee said.

A bevy of headlines broke out earlier this year when Taylor Swift became a victim of pornography generated by artificial intelligence. As she expressed her outrage over the fake images circulating across the web, Lee said he was reintroducing the PROTECT Act. If the bill passes, websites that contain adult material would have to verify the ages and consent of the people depicted, and quickly take down videos upon notice that consent was not given.

The rapid advancement of technology has exacerbated already existing issues with deepfake and AI pornography. “Our laws are already 10 steps behind and tomorrow the new thing will make them 12 steps behind,” Chris McKenna of Protect Young Eyes told the Deseret News in a phone interview.

“One of the reasons I feel so strongly about this is that victims had to wage an uphill battle for a really long time,” Lee said. “It’s a horrible thing to have been affected by this kind of thing to begin with, but it gets so much worse when they’ve then got to deal with the fact that their image may remain out there on websites for years, sometimes decades, into the future.”

Victim-survivors of image-based sexual abuse, including deepfake pornography, have described the experience of finding their images online time and time again as “retraumatizing,” Lee said. Survivors have met with him and talked about the bill, including Katelynn Spencer, who said, “there are no laws in my state to protect survivors of image-based sexual abuse.”

Tori Rousay, corporate advocacy program manager and analyst at the National Center on Sexual Exploitation, spoke with survivors as part of her Harvard thesis. One survivor named Ella said, “Within a split second of undertaking a reverse Google image search, my laptop screen was plastered with dozens of links to images of me on numerous pornographic sites across multiple pages of search results. My stomach sank to the floor. My heart was pounding out of my chest. My eyes widening in shock and disbelief. I was completely horrified.”

“From what many of these victims will express, it’s as though they’re being re-victimized over and over again,” Lee said, adding that the Department of Homeland Security, among other entities, has conducted investigations involving documented cases with victims of unauthorized images. “And in some cases, the predators have even targeted the children of survivors many years later. That’s why we really need tech companies to do more to prevent this sort of exploitation,” Lee said.

That’s why, a couple of weeks ago, Lee asked a pointed question at a Senate Judiciary Committee hearing.

Senators asked social media executives from X, Snapchat, TikTok, Meta and Discord questions about their efforts to protect children from the various harms that may come from social media. When it was his turn to speak, Lee said he wanted “to get answers.”

Lee asked Meta CEO Mark Zuckerberg why the company did not take more measures to restrict explicit content. When Zuckerberg said, “It’s my understanding that we don’t allow sexually explicit content for people of any age,” Lee asked, “and how is that going?”


What’s really motivating Lee is that he thinks “we need to hold these tech companies and their CEOs accountable.” He sees the PROTECT Act as putting the “onus on websites to protect their own users” and bringing hope to victims.

In some states that have passed laws restricting children’s access to pornography, there has been a discernible impact on the industry. After Utah passed its own such law, SB287, sponsored by Sen. Todd Weiler, R-Woods Cross, and Rep. Susan Pulsipher, R-South Jordan, Pornhub blocked Utah users from accessing its site.

Lee said there are still some lingering issues that make it difficult to have a significant impact on the pornography industry.

“The way the First Amendment’s been adopted, in this area, makes it virtually impossible to just substantially reduce their footprint because the First Amendment and the case law developed around it doesn’t contemplate that,” Lee said.

The Miller test has become the standard for determining obscenity and plays a major role in how courts rule on cases involving pornography.

The case of Miller v. California dealt with a man who was arrested after mass-mailing brochures with explicit, graphic depictions of men and women. The case was appealed all the way to the Supreme Court, which vacated his initial conviction and articulated a three-pronged approach to determining what’s obscene, now called the Miller test.

Under the test, a work is obscene if the average person, applying contemporary community standards, would find that the work as a whole appeals to prurient (or lascivious) interests; the work depicts sexual conduct in a patently offensive way; and the work lacks serious literary, artistic, political or scientific value. All three conditions need to be satisfied.

As the decades have passed, the landscape has changed and, as Lee put it, there’s a more pronounced sense of “the urgency of the crisis.” But Lee believes his bills are a good step forward in tackling the issue.

“The minute you start cracking down on this industry anywhere, I think it’s the minute that it gets easier to put a lot of these bad actors out of business.”