Requires state agencies to follow the AI Framework, which emphasizes principles such as human oversight, transparency, security, data privacy, diversity, auditing, and workforce empowerment. Obligates agencies to maintain AI inventories and to conduct risk assessments using the NIST AI Risk Management Framework.
Analysis summaries, actor details, and coverage mappings were LLM-classified and may contain errors.
This is a binding regulatory framework with mandatory requirements for state agencies, using enforcement language ('must', 'shall', 'required') and establishing specific compliance obligations including AI inventories and risk assessments.
The document covers nine subdomains, with strong focus on privacy and security (2.1, 2.2), discrimination and fairness (1.1, 1.3), transparency (7.4), system safety and robustness (7.3), governance (6.5), and human oversight (5.2). Coverage is concentrated in the privacy, security, fairness, and AI system reliability domains.
This framework governs AI use across all North Carolina state government agencies and operations, spanning multiple sectors including public administration, healthcare and social services, educational services, and other government functions. The document does not regulate private-sector entities; it governs internal state government AI applications.
The document comprehensively covers all AI lifecycle stages from planning through ongoing monitoring. It explicitly addresses design principles, data governance, model development and acquisition, pre-deployment testing and validation, deployment procedures, and continuous monitoring and auditing requirements.
The document uses the broad terms 'AI' and 'artificial intelligence' throughout without distinguishing between specific AI types such as foundation models, generative AI, or general-purpose AI. It defines AI as 'an engineered system where machines learn from experience' and 'automated decision-making.' There is no mention of compute thresholds, frontier AI, or open-weight models. The framework applies to 'all AI' used by state agencies.
State Chief Information Officer of North Carolina, North Carolina Department of Information Technology
The document is issued under the authority of the State Chief Information Officer pursuant to N.C.G.S. § 143B-1376, which grants responsibility for security and privacy of all state information technology systems.
State Chief Information Officer, Enterprise Security and Risk Management Office (ESRMO), Office of Privacy and Data Protection (OPDP), North Carolina Department of Information Technology
The State CIO has statutory authority for security and privacy oversight, with ESRMO and OPDP providing guidance on risk assessments and compliance monitoring through inventory reporting.
State Chief Information Officer, Enterprise Security and Risk Management Office (ESRMO), Office of Privacy and Data Protection (OPDP), North Carolina Department of Information Technology, individual State Agencies (self-monitoring)
Monitoring occurs at multiple levels: agencies must monitor and audit their own AI systems, report inventories to NCDIT, and undergo ongoing risk assessments with guidance from ESRMO and OPDP.
State Agencies as defined in N.C.G.S. § 143B-1320(a)(17), including agency personnel responsible for design, development, acquisition, and use of AI, and third parties developing AI on behalf of agencies
The framework explicitly applies to all State Agencies and covers AI designed, developed, acquired, or used by state agencies, including by third parties working on their behalf.
9 subdomains (7 Good, 2 Minimal)