Alaska's Education Commissioner used generative AI to draft a policy document on cellphone restrictions in schools, resulting in false academic citations appearing in an official state resolution reviewed by the Board of Education.
Alaska Education Commissioner Deena Bishop used generative artificial intelligence to draft a proposed policy on cellphone use in schools, producing a state document that cited purported academic studies that do not exist. The document was not disclosed as AI-generated and contained at least four false citations presented as studies published in scientific journals. The citations followed typical patterns of AI hallucination: real journal names paired with fabricated article titles and incorrect URLs that led to unrelated articles. A department spokesperson initially characterized the false sources as 'placeholders' used during drafting. Commissioner Bishop later acknowledged using generative AI to create the citations and said she caught the error before a state Board of Education meeting, sending corrected citations to board members. Even so, vestiges of AI-generated false information remained in the corrected document that the board voted on. The incident highlights concerns about AI misinformation influencing state policy and the absence of disclosure requirements for AI use in government documents.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that inadvertently generate or spread incorrect or deceptive information, which can lead to inaccurate beliefs in users and undermine their autonomy. Humans who make decisions based on false beliefs can experience physical, emotional, or material harm.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed