- High-profile Senate campaigns are using AI to produce entire video ads.
- AI chatbots are providing candidate talking points in private text messages.
- Utah requires political ads to disclose AI use, but there is still no federal policy.
Voters in one of the highest-stakes U.S. Senate races this year started seeing official campaign advertisements that were fully or mostly generated by artificial intelligence in January.
Months later, it is rapidly becoming the norm.
First came an ad of incumbent Texas Sen. John Cornyn, a conservative Republican, dancing the “Senate Swing” and the “Washington Waltz” with Democratic Senate candidate Jasmine Crockett.
The 40-second clip, paid for by Cornyn’s GOP primary challenger, Texas Attorney General Ken Paxton, was obviously fabricated, but it still contained a disclosure in small print that it was generated by AI.
Two months later, in March, Cornyn hit back with an entirely AI-generated video depicting Paxton driving with his alleged mistresses and receiving donations from the convicted felon Nate Paul.
The ad lacked a disclosure — as did another released days earlier by the National Republican Senatorial Committee, showing Texas Democratic Senate candidate James Talarico reading his own past social media posts.
These are the most visible examples of a new technological reality that is quickly shifting campaign operations around the country, including in Utah, to expand voter outreach and content production.
“The pace of change is faster than most people in this business are prepared for, and certainly faster than our regulatory frameworks are prepared for,” said Ben Haynes, senior partner at Elevate Strategies.
AI, according to its advocates, will help candidates connect with voters and help voters learn about candidates. Critics warn it tends to do so while violating privacy and manipulating the facts.
Technology has shifted politics many times over the past century. What makes this different, industry experts say, is that AI can remove people from the equation, and politics without people becomes something totally different.
What could positive AI campaigns look like?
AI has created a new frontier for political campaigns, and Tom Carroll is on the bleeding edge. Carroll is the co-founder of Convos — an AI product that allows campaigns to create a personalized chatbot to boost their candidate.
Convos contracts with campaigns to send mass texts with the hope of sparking AI conversations that win voters’ support through AI models trained entirely on a candidate’s website, past remarks or favorite talking points.
“We’re just as much a tool for the voter,” Carroll said. “Everybody who gets a text from us now for the rest of the campaign can hit that number back at any time and ask any question they have about the campaign and get an answer immediately.”
Convos was used in Wisconsin’s Supreme Court race, where the Democrat-aligned Defend Our Courts PAC hired the company to message more than 1 million voters, generating more than 10,000 conversations in less than three weeks.
Chris Taylor, a former Democratic state legislator, won the race by a large margin. To help get her across the finish line, Convos trained its AI product on her voting history, policy positions and election details voters asked about.
Multiple recent studies have found that AI large language models, or LLMs, can persuade people to believe true or false statements better than humans and can shift voter preferences more than traditional video ads.
It also improves campaigns, according to Carroll, who said that Convos is able to give campaigns detailed information on which questions, concerns and sources of frustration emerge most commonly from AI conversations.
“The campaign needs to be able to deliver that personalization at scale,” Carroll said.
This personalized approach will become necessary in the AI era of politics, Carroll said, but it will only work if campaigns are transparent that their outreach is driven by AI instead of volunteers.
What voters need to know about AI in politics
AI will make campaigns more convenient for consultants — it is already being used by many to generate speech drafts, social media statements and opposition research, according to Matt Lusty, founding partner at Election Hive.
But he believes AI might simultaneously make campaigns more confusing for voters.
A paper published last year found that LLMs that were programmed with partisan bias could significantly shape the opinions of voters — whether or not the voters initially aligned with the partisan preference of the AI model.
While 29 states have either banned political “deepfakes” — AI-generated images, videos or voices — or require AI-generated content to be disclosed, as Utah does, these laws do not affect how AI chatbots choose to describe candidates.
In 2024, Lusty said he asked ChatGPT which candidate he should vote for as a “MAGA Republican.” It recommended a candidate based on an erroneous claim that the candidate had the endorsement of President Donald Trump.
“From a consumer perspective, you have to ask yourself, ‘Wait a minute, is it getting harder and harder to tell what’s truth these days and what’s spin and what’s not spin?’” Lusty told the Deseret News. “AI is not always right.”
From the standpoint of campaigns, AI has many potential benefits. Against establishment opposition, New York City Mayor Zohran Mamdani used AI chatbots to organize volunteers and attract social media influencers.
In this way, AI may help lower-budget candidates reach a broader audience. But as campaigns lean in, AI disclosure must become the norm, Haynes said, even though there is currently no federal regulation on AI transparency.
Some campaigns already use AI to craft entire websites, mail pieces and speeches. Voters tend to detect the artificiality, Haynes said. But even if AI improves to the point where voters can’t, candidates should still limit their use of it.
“It’s important to note that campaigns and politics should always be about people,” Haynes told the Deseret News. “That is the beauty of this work, and no LLM or tech can replace that.”