

 
Artificial Intelligence (AI) and Campaign 
Finance Policy: Recent Developments 
Updated March 18, 2024 
No federal statute or regulation specifically addresses artificial intelligence (AI) in political campaigns. 
The Federal Election Campaign Act (FECA) and Federal Election Commission (FEC or commission) 
regulations govern conduct that calls for election or defeat of federal candidates or solicits funds affecting 
federal election campaigns. They also regulate some advertisements (electioneering communications) that 
refer to clearly identified federal candidates during preelection periods but do not call for election or 
defeat.  Disclaimer requirements that mandate attribution for communications regulated by campaign 
finance law appear to apply to ads created with AI. However, those requirements do not mandate that such 
advertising alert the audience, or regulators, to the presence of AI-generated content. Campaign 
management decisions, such as which technology to use, are generally not subject to regulation. 
This updated CRS Insight discusses recent developments that could be relevant as Congress monitors or 
considers legislation related to AI and campaign finance policy. It does not address legal issues. Other 
CRS products provide information on topics such as generative AI and AI policy areas other than 
campaign finance that could be relevant for Congress. 
AI in Political Campaigns, and Recent Legislative Developments 
Recent policy attention to AI in campaigns focuses on “deepfakes”: artificially manipulated audio or video content in political advertising. Such advertising appears to present new challenges for campaigns and voters in determining whether communications are authentic.
Recent legislation proposes disclaimers, reporting requirements, or prohibitions on deepfakes in federal 
campaigns or elections. Bills introduced in the 118th Congress include H.R. 3044; H.R. 3106; H.R. 3831; 
H.R. 4611; H.R. 5586; S. 686; S. 1596; S. 2770; and S. 3875. Legislation (H.R. 1; H.R. 5314) addressing 
various elections topics, including some provisions concerning deepfakes, passed the House in the 117th 
Congress but was not enacted. 
In May 2023, the American Association of Political Consultants (AAPC), a trade association representing 
political professionals, issued a statement explaining that its board of directors had unanimously 
“condemn[ed] use of deceptive generative AI content in political campaigns” and noted that such 
communications were inconsistent with the organization’s code of ethics. The AAPC position represents a 
voluntary professional standard, not a regulatory requirement. (The AAPC has also stated its support for a 
February 2024 Federal Communications Commission [FCC] prohibition on automated political telephone 
calls [robocalls] created using generative AI—a topic that is otherwise beyond the scope of this Insight.)  
Despite the focus on AI’s role in political advertising, AI can also serve campaign-management functions. For example, political professionals or volunteers could use AI to automate various internal campaign tasks or to supplement human labor in completing them. According to media reports, campaigns are already using AI to perform large-scale data analysis, compile opposition research, or draft targeted fundraising appeals.
Federal Election Commission Rulemaking Petitions 
On June 22, 2023, members of the FEC deadlocked (3-3) on whether to issue a notice of availability 
(NOA) to receive comments on an AI rulemaking petition from the interest group Public Citizen. Citing 
the potential for AI-generated ads to “provide political operatives with the means to produce campaign 
ads with computer-generated fake images of candidates,” the request asked the FEC to issue rules 
specifying that the prohibition on fraudulent misrepresentation of campaign authority, codified at 52 U.S.C. §30124, applies to AI-generated ads. At the June 22 meeting, some commissioners expressed skepticism
about the agency’s statutory authority to regulate AI ads; others expressed support for a rulemaking. On 
July 13, 2023, several Members of Congress wrote to the commission expressing “disappoint[ment]” with 
the FEC’s action and requesting additional information. Also on July 13, Public Citizen submitted a new
rulemaking petition.  
The commission considered the new petition on August 10, 2023, and in this case approved a NOA. The limited discussion at the August 10 meeting suggested that at least some commissioners continued to have reservations about the commission’s authority to regulate AI ads in particular; about the appropriateness of the FECA fraudulent misrepresentation provision as an avenue for regulating AI ads; or both.
The comment period closed on October 16, 2023. The NOA provided an opportunity for the public and Members of Congress to comment on these or other questions; submissions are available on the FEC website under document REG 2023-02. Fifty-two Members of Congress submitted joint comments encouraging the FEC to adopt rules specifying that the FECA fraudulent misrepresentation provisions apply to ads created using generative AI and to require disclaimers on ads created with the technology. Several interest
groups, scholars, and private citizens also submitted comments both supporting and opposing a 
rulemaking. It is unclear when or whether the commission might take additional action.  
Potential Policy Considerations for Congress 
If pursuing legislation, Congress might need to determine whether to do so narrowly, such as by 
addressing specific AI issues, or to also address other campaign finance or elections topics. Congress has 
pursued both approaches to campaign finance regulation recently. If Congress chose to task the FEC with 
pursuing rulemaking without also providing additional statutory guidance, it is possible that the commission would be unable to secure the minimum four of six votes required to agree on how to proceed.
Maintaining the status quo likely would reinforce the emerging debate about whether additional 
regulation is needed, including about what role industry should play. This approach could have the 
advantage of providing time to gather additional information about how AI evolves during the 2024 
election cycle and where legislative coalitions might exist. It could have the disadvantage of delaying 
opportunities to clarify how or whether Congress intends existing or future legislative or regulatory 
options to apply to AI in campaigns and elections. Congress could also require agency (or congressional committee or task force) study of AI issues before, or in addition to, other policymaking, as some recent
legislation proposes (or has required in non-campaign finance matters).  
Amending FECA would be a typical approach to further regulate ads that are made by political 
committees, solicit funds, engage in express advocacy, or refer to federal candidates through 
electioneering communications. Although Congress could also amend FECA or another statute to require 
disclaimers on ads that do not meet those requirements (e.g., issue advocacy), federal campaign finance 
law generally does not currently regulate issue advocacy. Prohibiting AI-generated ads might raise First
Amendment concerns, such as those discussed in another CRS campaign finance product.  
 
 
Author Information 
 
R. Sam Garrett
Specialist in American National Government

Disclaimer 
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff 
to congressional committees and Members of Congress. It operates solely at the behest of and under the direction of 
Congress. Information in a CRS Report should not be relied upon for purposes other than public understanding of 
information that has been provided by CRS to Members of Congress in connection with CRS’s institutional role. 
CRS Reports, as a work of the United States Government, are not subject to copyright protection in the United 
States. Any CRS Report may be reproduced and distributed in its entirety without permission from CRS. However, 
as a CRS Report may include copyrighted images or material from a third party, you may need to obtain the 
permission of the copyright holder if you wish to copy or otherwise use copyrighted material. 
 
IN12222 · VERSION 4 · UPDATED