The 2024 election cycle saw artificial intelligence deployed by political campaigns for the very first time. While candidates largely avoided major mishaps, the tech was used with little guidance or restraint. Now, the National Democratic Training Committee (NDTC) is rolling out the first official playbook making the case that Democratic campaigns can use AI responsibly ahead of the midterms.
In a new online training, the committee has laid out a plan for Democratic candidates to leverage AI to create social content, write voter outreach messages, and research their districts and opponents. Since the NDTC’s founding in 2016, the organization says, it has trained more than 120,000 Democrats seeking political office. The group offers virtual lessons and in-person bootcamps training would-be Democratic politicians on everything from ballot registration and fundraising to data management and field organizing. With its AI course, the group is largely targeting smaller campaigns with fewer resources, seeking to empower what could be five-person teams to work with the “efficiency of a 15 person team.”
“AI and responsible AI adoption is a competitive necessity. It's not a luxury,” says Donald Riddle, senior instructional designer at the NDTC. “It's something that we need our learners to understand and feel comfortable implementing so that they can have that competitive edge and push progressive change and push that needle left while using these tools effectively and responsibly.”
The three-part training includes an explanation of how AI works, but the meat of the course revolves around possible AI use cases for campaigns. Specifically, it encourages candidates to use AI to prepare text for a variety of platforms and purposes, including social media, emails, speeches, phone-banking scripts, and internal training materials, all of which should be reviewed by humans before being published.
The training also points out ways Democrats shouldn’t use AI and discourages candidates from using AI to deepfake their opponents, impersonate real people, or create images and videos that could “deceive voters by misrepresenting events, individuals, or reality.”
“This undermines democratic discourse and voter trust,” the training reads.
It also advises candidates against replacing human artists and graphic designers with AI to “maintain creative integrity” and support working creatives.
The final section of the course also encourages candidates to disclose AI use when content features AI-generated voices, comes off as “deeply personal,” or is used to develop complex policy positions. “When AI significantly contributes to policy development, transparency builds trust,” it reads.
These disclosures are the most important part of the training to Hany Farid, a generative AI expert and UC Berkeley professor of electrical engineering.
“You need to have transparency when something is not real or when something has been wholly AI generated,” Farid says. “But the reason for that is not just that we disclose what is not real, but it's also so that we trust what is real.”
When using AI for video, the NDTC suggests that campaigns use tools like Descript or Opus Clip to craft scripts and quickly edit content for social media, stripping video clips of long pauses and awkward moments.
The free course was created in collaboration with the Higher Ground Institute, the nonprofit arm of the progressive tech incubator Higher Ground Labs. The NDTC and the Higher Ground Institute plan to refresh the training as new tools and use cases develop.
“Our approach focuses on turning fear into a force multiplier—thousands of Democratic campaigns can now leverage AI to compete at any scale, everywhere,” says Kelly Dietrich, founder and CEO of the NDTC. “We have an incredible opportunity to take back power in 2026 and save our country from MAGA fascism.”
The NDTC course is the first major attempt to teach Democrats how to bolster their campaigns with AI. Last cycle, Democrats used AI to handle routine tasks like drafting fundraising emails but generally refrained from using it for more strategic functions. In the training, the NDTC argues that Democrats are falling behind while Republicans have already incorporated the tech across campaign functions.
“We need some campaigns to really invest in it and try it. We have yet to see Democratic campaigns where it’s integrated at every level,” says Kate Gage, cofounder of the Higher Ground Institute and executive director at the Cooperative Impact Lab.
During the 2024 election, Republican campaigns, including President Donald Trump’s campaign, were eager to integrate AI into their efforts. Groups supporting Republican Florida governor Ron DeSantis published videos with AI-generated planes and fake audio of Trump in campaign ads and social posts. Weeks before the election, Trump shared a deepfaked image of Taylor Swift endorsing the then-Republican nominee for president. The GOP spent more than $1.2 million total on Campaign Nucleus and its AI tools alone over the last election, according to Bloomberg Government. The company, founded by former Trump campaign manager Brad Parscale, offers AI tools to target ads and automate tedious tasks.
“It's an interesting question as to whether both sides of the political aisle will play with the same rules,” says Farid. “The parties don't operate with the same rules, and so I don't think we're necessarily going to see this from the other side of the political aisle, and I think that's going to complicate this whole equation.”