Prohibits social media platforms from using algorithmic recommendation systems for users under 18, and bans access for users under 13. Requires age verification and parental consent for minors on social media, and establishes enforcement mechanisms and a secure digital identification credential pilot program.
Analysis summaries, actor details, and coverage mappings were LLM-classified and may contain errors.
This is binding federal legislation introduced in the U.S. Senate, imposing mandatory obligations on social media platforms, with explicit enforcement mechanisms through the Federal Trade Commission and state attorneys general and civil penalties for violations.
The document primarily addresses risks in the Human-Computer Interaction domain (5.1, 5.2) with good coverage, focusing on protecting minors from overreliance and loss of agency on social media. It has minimal coverage of Privacy & Security (2.1) through age verification requirements, and Malicious Actors (4.3) through fraud prevention in the pilot program. The document does not substantially address AI-specific risks but rather focuses on social media platform governance for child protection.
The document primarily governs the Information sector, specifically social media platforms and digital services. It also has minimal coverage of the Professional and Technical Services sector, through identity verification technology providers that may participate in the pilot program.
The document does not substantially address AI system lifecycle stages. While it regulates algorithmic recommendation systems on social media platforms, it focuses on restricting their use for minors rather than governing their development, deployment, or monitoring. The document is primarily concerned with age verification and parental consent mechanisms rather than AI system governance.
The document explicitly mentions algorithmic recommendation systems but does not reference AI models, AI systems, or any specific AI categories such as frontier AI, general purpose AI, foundation models, or generative AI. It does not mention compute thresholds or open-weight models. The focus is on social media platform functionality rather than AI technical specifications.
United States Congress; Senator Brian Schatz; Senator Tom Cotton; Senator Chris Murphy; Senator Katie Britt; Senate Committee on Commerce, Science, and Transportation
The bill was introduced in the U.S. Senate by Senator Schatz and co-sponsors, and referred to the Senate Committee on Commerce, Science, and Transportation for consideration.
Federal Trade Commission (FTC); State Attorneys General
The Act designates the Federal Trade Commission as the primary enforcement body, with explicit authority to enforce violations as unfair or deceptive practices. State attorneys general are also granted enforcement authority to bring civil actions on behalf of state residents.
Federal Trade Commission; State Attorneys General; Inspector General
The FTC and state attorneys general have monitoring authority through their enforcement powers, including investigatory powers. The Inspector General is specifically tasked with oversight of the pilot program's implementation.
Social media platforms operating in the United States
The Act explicitly targets social media platforms that offer services to users in the United States and allow users to create accounts to publish or distribute content. The definition encompasses online applications and websites meeting specific criteria.
4 subdomains (2 Good, 2 Minimal)