The Brave New World of AI in Political Campaigns


March 28, 2024


This story originally appeared in GW Today.

With the 2024 elections looming, artificial intelligence (AI) as a force in campaigns is already being regarded warily after being used in robocalls to mimic President Joe Biden during the New Hampshire primaries and to create attack ads by supporters of Florida Gov. Ron DeSantis (R) that misleadingly used the voice of former President Donald Trump.

In a panel discussion at the George Washington University titled “Ethics in Political Communication: Navigating a Shifting Landscape from Digital Outreach to AI,” political analysts said these examples are not representative of how AI is generally used by professional political strategists.

“AI is the topic of conversation nationally in politics 2024,” said School of Media and Public Affairs (SMPA) Director Peter Loge to an audience Tuesday evening in the SMPA Fifth Floor Studio. (The discussion also was streamed live.) “We are going to talk about how campaigns are using or not using it, the technical constraints and limitations…and what disruptions we can expect.”

These issues were taken up by Alisa Brady, a managing director on the optimization team of Targeted Victory, a Republican media firm that advises candidates and corporations; Crystal Patterson, CEO and president of the Washington Media Group and a communications and political expert who worked on Hillary Clinton’s presidential campaign and served as a policy expert at Facebook; and Sarah Mintzer, an SMPA junior political communications major who is studying AI’s development.

The discussion was the fifth in a series on the theory and practice of political campaigns in 2024 held by SMPA in partnership with the magazine Campaigns & Elections.

The analysts emphasized that at this point AI is mainly being used politically as a predictive rather than a generative tool—to assess where voters stand and how to communicate with them more effectively.

Patterson explained that when she started campaign work nearly two decades ago, campaigns might send an email out once a month. Now, the technology is used to operate more efficiently and to ensure that what campaigns produce for candidate clients works.

“[We may put] a memo into ChatGPT to see if we are highlighting the right points or if there are patterns we are missing,” Patterson said. “You can turn around scripts for phone banking, for robocalls or for your candidates faster.”

Brady said she had carved out a particular area at Targeted Victory for using AI as a tool to measure the best messaging language that is then used by veteran political operatives to draft content to reach a specific audience.

“Our strategists are now presented with more options in terms of the types of language they can use to help them be more creative,” Brady said. “Our top copywriters, content specialists—the AI does not beat them. It just gives them one more phrase that they might not have come up with themselves or that would have taken multiple hours of drafting.”

For Mintzer, this perspective is less frightening than the warnings she heard from professors about sticking to rules for using AI in classes, or the stories that AI might affect, or even replace, people like her in their jobs—a fear she said she has since gotten over. “So I think it is good to absorb everything that is going on and just be mindful,” Mintzer said. “I wouldn’t say lucky is the right word, but it definitely is an interesting experience to get to be here in an election year and watch all of this take shape.”

Since AI is still relatively new, Brady said, the open-source environment tends to be collaborative and experimental, with practitioners sharing what they learn. “People are posting their experiences with certain algorithms, making certain recommendations on things to try,” she said.

Loge noted that this all sounds benign and wondered how worried he should be, given reports of the AI-generated newscasts of Channel One, the Biden robocalls and the Trump attack ads. “It sounds like this can get very nefarious, very fast,” he said.

Patterson said that such fears are, in fact, “well founded and healthy.”

“I hope it helps breed skepticism that encourages people to get more information,” Patterson continued.

She explained that there may be a generational divide in how information generated by AI is consumed, which is a problem for social media in general. Young people who grew up with smartphones, she said, tend to understand naturally the dynamics of TikTok and Instagram. As for older people, who vote at higher rates, said Patterson, “they are going to click on things and sign up for monthly donations when they don’t realize they are doing it. This is a brave new world for them.”