Requires the Director of the National Institute of Standards and Technology (NIST) and the Director of the Office of Management and Budget (OMB) to issue guidance on how federal agencies can use NIST’s AI Risk Management Framework to reduce the risks associated with artificial intelligence.
Analysis summaries, actor details, and coverage mappings were LLM-classified and may contain errors.
This is a federal statute enacted by the United States Congress that imposes binding legal obligations on federal agencies, including mandatory compliance requirements, enforcement mechanisms, and reporting duties.
The document provides minimal direct coverage of specific risk subdomains. While it extensively references the NIST AI Risk Management Framework (which itself addresses multiple risks), the Act primarily establishes governance mechanisms and procedural requirements rather than explicitly describing specific AI risks or harms. Its emphasis on risk management, testing, and validation shows implicit awareness of risk, but it does not detail specific risk scenarios.
This document governs AI use across all federal government agencies in the executive branch, which primarily falls under Public Administration. The requirements apply to government procurement and use of AI systems across all agency functions, with an explicit exception for national security systems.
The document comprehensively covers multiple AI lifecycle stages with particular emphasis on procurement, deployment, and operational monitoring. It addresses planning through framework adoption requirements, development through standards and testing protocols, and ongoing operations through monitoring and reporting obligations.
The document explicitly references 'artificial intelligence' throughout and defines it by reference to the National Artificial Intelligence Initiative Act of 2020. It addresses AI systems broadly in the context of federal agency use and procurement. The document does not distinguish between different types of AI (frontier, general purpose, task-specific, generative, predictive) or mention compute thresholds or open-weight models.
United States Congress
The document is titled 'Federal Artificial Intelligence Risk Management Act of 2023' and takes the form of a Congressional statute, indicating that it originated in and was passed by the United States Congress.
Office of Management and Budget (OMB); Director of OMB; Federal Acquisition Regulatory Council; Administrator of Federal Procurement Policy; National Institute of Standards and Technology (NIST); Director of NIST
The Act directs OMB to issue binding guidance with which agencies must comply, NIST to develop technical guidelines, and the Federal Acquisition Regulatory Council to promulgate procurement regulations. These entities have the authority to establish and enforce compliance requirements.
Comptroller General of the United States; Government Accountability Office (GAO); Office of Management and Budget; Director of NIST; Congress
The Act establishes monitoring mechanisms through GAO studies, OMB reporting to Congress, and NIST's ongoing review of testing and validation practices. Congress receives regular reports on agency implementation and conformity.
Federal agencies (departments, independent establishments, Government corporations, or other agencies of the executive branch); Suppliers of artificial intelligence to federal agencies; Small business concerns
The Act explicitly applies to federal agencies that develop, procure, and use artificial intelligence systems, as well as to suppliers that provide AI to those agencies. Small business concerns are also specifically addressed in the profile development requirements.
5 subdomains (1 Good, 4 Minimal)