Requires political advertisements containing images or video generated with artificial intelligence to include clear disclaimers. Directs the Federal Election Commission (FEC) to define AI-generated content, issue implementing regulations, and report on compliance and enforcement.
Analysis summaries, actor details, and coverage mappings were LLM-classified and may contain errors.
This is a federal bill (H.R. 3044), introduced in the U.S. House of Representatives, that would amend the Federal Election Campaign Act of 1971. If enacted, it would impose mandatory requirements enforced through the Federal Election Commission.
The document primarily addresses risks related to malicious actors using AI for disinformation (4.1), misinformation and information pollution (3.1, 3.2), and governance mechanisms (6.5). It has minimal coverage of AI system transparency (7.4). Its focus is concentrated on preventing AI-generated content from being used to mislead voters in political contexts.
The document primarily governs the Information sector (online platforms, social networks, search engines, digital applications) and Professional and Technical Services sector (advertising agencies, third-party advertising vendors). It also has implications for Public Administration through its regulation of political campaign communications.
The document does not address specific AI development lifecycle stages. Instead, it regulates the deployment and use of AI-generated content (images and video footage) in political advertisements, requiring disclaimers when such content is used. Its primary concern is the operational use of AI outputs in one specific context: political advertising.
The document explicitly mentions 'artificial intelligence (generative AI)' and focuses specifically on generative AI capabilities for creating images and video footage. It does not mention AI models, AI systems, frontier AI, general purpose AI, task-specific AI, foundation models, predictive AI, open-weight models, or compute thresholds.
Ms. Clarke of New York, United States Congress (House of Representatives), Committee on House Administration
The bill was introduced by Representative Clarke of New York in the House of Representatives and referred to the Committee on House Administration. Congress is explicitly identified as the proposing body in the Sense of Congress section.
Federal Election Commission (FEC)
The FEC is explicitly designated as the enforcement body responsible for promulgating regulations, assessing compliance, and enforcing the requirements of this Act.
Federal Election Commission (FEC), United States Congress
The FEC is required to monitor compliance and submit biannual reports to Congress assessing compliance with and enforcement of the requirements. Congress receives these reports and oversees the FEC's implementation.
Candidates, political committees, online platforms (websites, web applications, digital applications, social networks, ad networks, and search engines with 50,000,000+ monthly U.S. visitors), third-party advertising vendors, and anyone making political advertisements containing AI-generated content
The document targets those who create, place, or promote political advertisements containing AI-generated images or video footage. This includes candidates, political committees, online platforms meeting size thresholds, and third-party advertising vendors who sell or buy advertisement space.
6 subdomains (2 Good, 4 Minimal)