Mandates the development and updating of toolkits on the impacts of AI and social media on youth. Requires stakeholder consultation. Specifies evidence-based content on digital resilience. Directs the inclusion of tailored guidance. Instructs dissemination through institutions. Authorizes $2,000,000 for implementation.
Analysis summaries, actor details, and coverage mappings were LLM-classified and may contain errors.
This is a binding federal statute enacted by the United States Congress that imposes mandatory obligations on federal agencies (the Departments of Education and Health and Human Services) to develop, update, and disseminate educational toolkits. The Act uses mandatory language throughout and includes a specific appropriation for implementation.
The document primarily addresses risks in the Discrimination & Toxicity domain (1.2, 1.3), Human-Computer Interaction domain (5.1, 5.2), and Misinformation domain (3.1). It focuses on protecting youth from toxic content, overreliance on AI, loss of agency, and exposure to false information. There is minimal coverage of privacy concerns (2.1) and malicious actor risks (4.1, 4.3). The document does not substantively address system safety failures, governance structures, or socioeconomic impacts.
This document primarily governs the Educational Services sector by mandating the development and dissemination of AI and social media educational toolkits to schools, educators, and educational agencies. It also substantially covers the Health Care and Social Assistance sector through requirements for health care providers serving pediatric patients. The Public Administration sector is governed as well, since the implementing agencies are federal departments.
The document does not directly govern stages of the AI development lifecycle. Instead, it mandates the creation of educational toolkits about the impacts of AI and social media on youth. The focus is on educating stakeholders about AI systems already deployed and in operation, particularly regarding their effects on students' mental health and digital resilience.
The document explicitly mentions AI systems, generative AI, and companion chatbots. It does not define specific AI categories such as frontier AI, general-purpose AI, or foundation models, and it mentions no compute thresholds. The focus is on AI applications affecting youth, particularly social media platforms and chatbot technologies.
United States Congress
The document is a Congressional Act enacted by the United States Congress, as indicated by its legislative format and structure.
Department of Education; Department of Health and Human Services; Secretary of Health and Human Services; Secretary of Education
The Secretaries of Education and Health and Human Services are designated as the enforcement bodies responsible for implementing the Act's requirements, including development, updating, and dissemination of toolkits.
Department of Education; Department of Health and Human Services
The Departments of Education and Health and Human Services are implicitly responsible for monitoring implementation through their biennial update requirements and ongoing dissemination obligations, though no explicit independent monitoring body is designated.
Department of Education; Department of Health and Human Services; State educational agencies; local educational agencies; elementary schools; secondary schools; Bureau-funded schools; educators; specialized instructional support personnel; health care providers serving pediatric patients; students; parents; guardians; caregivers; school or educational agency administrators
The Act primarily targets federal agencies (the Departments of Education and Health and Human Services), which bear mandatory obligations to develop the toolkits. Secondary targets include educational institutions, educators, health care providers, students, and families, who are the intended recipients and users of the toolkits.
11 subdomains (2 Good, 9 Minimal)