So much of political rhetoric and marketing copy consists of platitudes and bumper sticker slogans, making it an industry potentially ripe for disruption by artificial intelligence. But should voters be worried?

“I think it’s useful to think of ChatGPT and generative AI in general as a cliché generator,” or “autocomplete on your phone if it was scaled 10,000 times,” said David Karpf, referring to ChatGPT, the chatbot released last November that answers questions and follows user prompts.

“Most of what we do in politics is also cliché generation,” said Karpf, an associate professor of media and public affairs at George Washington University, during an online panel on AI in politics hosted by the Project on Ethics in Political Communication on Friday. “We send out during election season thousands and thousands of emails and we A/B test those within an inch of our lives.”

Generative AI has the potential to take over the task of churning out banalities for political professionals, writing text for everything from fundraising emails and text messages to social media posts and speech scripts. But not so fast, Karpf said. Campaigns and consultants don’t want to be the first caught making an error, he said, because the risk and potential reputational downside of getting it wrong are so high.

“You don’t want to become famous as the political consultant or the political campaign that blew it because you decided that you could have a generative AI do this for you,” he said.


That’s already happened in journalism. CNET, a tech industry news site, used AI to write articles but put the program on hold in January after issuing corrections for 41 of the 77 AI-written stories it published.

Initial use cases for AI in political communication might be early drafts that need human editing before they’re ready for voters. Zainab Chaudary, senior vice president at New Heights Communications, a political advocacy communications firm, said her firm has experimented with prompting generative AI and found the results “fine,” but “definitely not something we would present to a client.”

“It’s probably equivalent to going on Wikipedia, doing your research, looking online for your resources, then using that to build your draft,” Chaudary said.

Because AI tools can only draw on what has already been written, they can’t think outside that box, which could be another problem for political professionals.

“It just regurgitates back at us whatever is available and sometimes what is available has not been working,” she said.

These tools also can’t replace a person’s own lived experience and the perspective it gives them, said Yvonne Liccione, a political communication student at GWU’s school of media and public affairs.

“There’s a level of human filtering that ChatGPT can’t do,” she said.

ChatGPT broke the record for fastest-growing consumer app earlier this year, according to research by the bank UBS, reaching an estimated 100 million active users by January, two months after its launch.