Tromsø municipality in Norway used an AI chatbot to write a 120-page report recommending school closures, which contained 11 fabricated references out of 18 total citations, leading to public outcry and suspension of the closure plan.
In Tromsø, Norway, the local administration prepared a 120-page report recommending the closure of several schools and kindergartens due to declining enrollment. Municipal director Stig Johnsen admitted the report had been written using AI after local residents discovered that 11 of the 18 references it cited did not exist. One fabricated reference was to a non-existent book titled 'Quality in School: Learning, Well-being and Relationships' attributed to Professor Thomas Nordahl, who confirmed he never wrote such a book. The fabricated citations angered local residents already opposed to the school closures, who viewed the administration's justifications as inconsistent. The episode has been called 'perhaps the first major AI scandal in the Norwegian public sector.' Mayor Gunnar Wilhelmsen expressed shock at the incident, and the closure plan has been temporarily halted, with the administration planning to restart the process later this year. Norway's Minister for Digitalisation, Karianne Tung, controversially congratulated the municipality on using AI despite the scandal.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that inadvertently generate or spread incorrect or deceptive information, which can lead to inaccurate beliefs in users and undermine their autonomy. Humans who make decisions based on false beliefs can experience physical, emotional, or material harms.
Human
Due to a decision or action made by humans
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed