Amends the Consolidated Appropriations Act, 2022, to define a "digital forgery" as an AI-created intimate visual depiction. Creates a civil cause of action for victims of non-consensual digital forgeries. Specifies damages and relief. Permits privacy protections for plaintiffs. Establishes a 10-year statute of limitations. Ensures non-preemption of other law.
Analysis summaries, actor details, and coverage mappings were LLM-classified and may contain errors.
This is a binding federal statute enacted by the United States Congress that amends existing law to create civil causes of action with specific enforcement mechanisms, damages, and legal remedies for violations involving AI-generated intimate imagery.
The document covers five subdomains (four rated Good, one Minimal), with strong focus on malicious actors (4.1, 4.3), privacy compromise (2.1), toxic content (1.2), and misinformation (3.1). Coverage is concentrated in privacy violations, malicious use of AI to create non-consensual intimate imagery, and fraud/manipulation domains.
This Act does not govern AI use within specific economic sectors. Rather, it establishes civil liability for individuals and entities in any sector who create, possess, or distribute non-consensual AI-generated intimate imagery. The regulation is cross-sectoral, applying to any person or organization regardless of industry.
The document primarily addresses the "Deploy" and "Operate and Monitor" lifecycle stages by regulating the disclosure, distribution, and use of AI-generated intimate imagery. It does not address the development, training, or validation of AI systems themselves, but rather their harmful outputs and the distribution of those outputs.
The document explicitly mentions AI systems and machine learning as technologies used to create digital forgeries. It does not distinguish between different types of AI (frontier, general purpose, task-specific) or mention compute thresholds, foundation models, or open-source models. The focus is on generative AI capabilities that create intimate visual depictions.
United States Congress
The document is a federal Act proposed and enacted by the United States Congress, as indicated by the legislative format and citation structure.
United States District Courts
Enforcement is conducted through the federal court system, specifically United States District Courts, which have jurisdiction to hear civil actions and issue remedies.
The Act does not establish a specific monitoring body or oversight mechanism. Enforcement relies on private civil actions brought by affected individuals rather than proactive government monitoring.
The Act targets any person who produces, possesses, discloses, or solicits AI-generated intimate imagery without consent, including developers who create such tools and users who deploy them for harmful purposes. Affected individuals whose likenesses are used are granted the right to bring civil actions.
5 subdomains (4 Good, 1 Minimal)