Amends the Controlled Substances Act to require electronic communication and remote computing service providers to report suspected illegal activities involving controlled substances to the DEA. Permits algorithmic content moderation, imposes penalties for noncompliance, and mandates annual reporting by the DEA.
Analysis summaries, actor details, and coverage mappings were LLM-classified and may contain errors.
This is a binding federal statute enacted by the United States Congress that amends the Controlled Substances Act, imposing mandatory reporting requirements on electronic communication service providers, with explicit criminal and civil penalties for noncompliance.
The document has good coverage of seven subdomains, with strong focus on malicious actors (4.1, 4.2, 4.3), AI system security (2.2), privacy compromise (2.1), governance failure (6.5), and lack of transparency (7.4). Coverage is concentrated in the security, misuse prevention, privacy, and governance domains.
The document primarily governs the Information sector, specifically electronic communication service providers and remote computing services (telecommunications, data processing, social media platforms). It also has secondary applicability to Health Care and Social Assistance through regulation of illegal prescription drug distribution online.
The document primarily addresses the Deploy and the Operate and Monitor stages of the AI lifecycle. It focuses on operational requirements for content moderation systems (both human and algorithmic) that are already deployed, requiring reporting when these systems detect illegal drug-related activities. The document does not address earlier lifecycle stages such as planning, data collection, or model building.
The document explicitly mentions algorithmic content moderation and machine learning methods used by electronic communication service providers. It does not define or specifically mention AI models, AI systems, frontier AI, general purpose AI, foundation models, generative AI, or compute thresholds. The focus is on the operational use of algorithms and machine learning for content moderation purposes.
United States Congress
The document is titled the 'Cooper Davis Act' and is structured as federal legislation, with a short-title section beginning 'This Act may be cited as', indicating Congressional authorship and proposal.
Attorney General of the United States and Drug Enforcement Administration (DEA)
The Act explicitly designates the Attorney General as the enforcement authority and the DEA as the receiving and investigating agency for reports, with authority to impose criminal and civil penalties.
Drug Enforcement Administration (DEA)
The DEA is required to publish annual reports tracking the number of reports received, their sources, outcomes, and enforcement actions, establishing it as the primary monitoring body for implementation and effectiveness.
Electronic communication service providers and remote computing service providers (as defined in 18 U.S.C. 2510 and 2711)
The Act explicitly targets 'electronic communication service providers' and 'remote computing service providers', requiring them to report suspected illegal activities involving controlled substances. These providers include platforms that offer communication and computing services where content moderation occurs.
7 subdomains (6 Good, 1 Minimal)