As technology advances, the addition of one word to state law might be vital to protect victims of images and videos altered to be sexually explicit.

Sen. Karen Kwan, D-Murray, introduced SB66, a bill that would amend the definition of counterfeit intimate image to add the word “generated.” In a public hearing Wednesday, Kwan said that the addition of this word is intended to “close a potential loophole with all the new AI technology.”

Artificial intelligence technology has advanced considerably in the couple of years since Kwan said she ran a bill on counterfeit images. “At the time, I called them deep fakes,” she said. “But now they’re also AI and ChatGPT kinds of things.”

If the bill passes, the definition of counterfeit intimate image would read “any visual depiction, photograph, film, video, recording, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, that has been edited, manipulated, generated, or altered to depict the likeness of an identifiable individual” in sexually explicit ways that the law lists.

Kwan’s bill was unanimously recommended by the Senate Judiciary, Law Enforcement and Criminal Justice Committee Wednesday.

The bill comes amid growing questions about whether state and federal laws account for sexually explicit images generated by artificial intelligence.

New York case

One such case occurred in New York. A man named Patrick Carey was sentenced to six months in jail and 10 years of probation after pleading guilty “to multiple felonies in the deepfake scheme, including promoting a sexual performance by a child, aggravated harassment as a hate crime and stalking,” per NBC New York.

“Patrick Carey targeted these women, altering images he took from their social media accounts and the accounts of their family members and manipulating them using ‘deepfake’ technology to create pornography that he disseminated across the internet,” Nassau County District Attorney Anne Donnelly told NBC New York.

Donnelly told NBC New York that the explicit image of a minor discovered by investigators was “the only reason” he was sentenced to jail time. “New York State currently lacks the adequate criminal statutes to protect victims of ‘deepfake’ pornography, both adults and children,” Donnelly said.

Tori Rousay, corporate advocacy program manager and analyst at the National Center on Sexual Exploitation, said she also has concerns about the path to legal recourse for victims of deepfake pornography.

“First, we don’t have a federal law addressing this problem,” Rousay told the Deseret News over the phone. “We don’t even have a federal statute addressing image-based sexual abuse in general or what some call revenge pornography.”

While states do have laws that address deepfake and AI-generated explicit imagery, Rousay said “one of the issues is that they define deepfake or digitally altered images in very different contexts and most of them require that intent be shown of harm.” Proving intent to harm can be “extremely difficult,” she explained, because the images are often generated anonymously.

Another issue, Rousay said, is how quickly the technology that can strip clothing from a person’s image and generate sexually explicit images and videos has spread. She hopes that if a model or code is created with the intent to do such things, a policy will be in place to remove it from public interfaces and open-source sites to curb its use.

While completing her master’s degree at Harvard, Rousay spoke with victims of deepfake pornography who described how emotionally devastating it was to have images of themselves digitally altered to be sexually explicit.

“It’s also gendered violence,” Rousay said. Even if these images and videos are taken down from one website, she said, victims can never be sure they are gone everywhere. “It never really ended even after the images were pulled down because you can’t really confirm if they’re there or not,” she said, explaining that victims felt their suffering continued long after the images were removed in one location.

Chris McKenna, founder of Protect Young Eyes, said “the ability for harm to be perpetrated against individuals will grow faster than we can control it with laws and policies given that an image can be slightly altered — a head can be animated into pornographic form.” The capacity of artificial intelligence to alter images is expansive.

“Our laws are already 10 steps behind and tomorrow the new thing will make them 12 steps behind,” McKenna said in a phone interview about the way the U.S. handles deepfake pornography. Statutes, he explained, often do not account for how quickly artificial intelligence has advanced. Many of the laws on the books date back decades, and state and federal law have struggled to keep pace, but McKenna emphasized how important it is to implement changes before more harm spreads.


“Participants portrayed their experiences of image-based sexual abuse and sexual deepfakes as one of irreparable harm,” Rousay wrote in her thesis. “While participant experiences were marked by variations in context, age, gender, and medium, the complete and insurmountable devastation was unanimously experienced amongst all victim-survivors.”

Rousay documented the account of one victim, identified as Ella, who said: “Within a split second of undertaking a reverse Google image search, my laptop screen was plastered with dozens of links to images of me on numerous pornographic sites across multiple pages of search results. My stomach sank to the floor. My heart was pounding out of my chest. My eyes widening in shock and disbelief. I was horrified.”

For many victims, the suffering doesn’t stop with the images.

“While there is an identifiable catalyst (creation) to future experiences of abuse, victim-survivors are unable to identify a definitive end to their abuse — because it doesn’t exist,” Rousay wrote in her thesis. “In some cases, victim-survivors were inundated with intrusive and harassing commentary in which they were blamed, shamed, and held responsible by the general online public, as well as their friends and families, for becoming pornography without their consent or knowledge for years after the initial event occurred.”
