Springer Nature published a book on AI ethics containing dozens of fabricated academic citations that appear to have been generated by AI tools; in some chapters, over 70% of citations were unverifiable or entirely invented.
Springer Nature, one of the world's largest academic publishers, published a book titled 'Social, Ethical and Legal Aspects of Generative AI', priced at £125, that contains numerous fabricated academic citations likely generated by AI. Analysis by experts, including Guillaume Cabanac of the University of Toulouse and Dr Nathan Camp of New Mexico State University, found that multiple chapters cited non-existent journals and invented research papers. In one chapter, 8 of 11 citations could not be verified, suggesting over 70% were fabricated; in another, 11 of 21 citations could not be matched to known academic papers. The fabricated citations included references to journals that do not exist, such as the 'Harvard AI Journal'. This constitutes research misconduct involving the falsification and fabrication of references. The incident follows a previous case in April, in which Springer Nature withdrew another technology book, 'Mastering Machine Learning: From Basics to Advanced', for containing fictitious references. Each chapter was written by different authors; some chapters appear accurate, while others contain the fabricated citations.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that inadvertently generate or spread incorrect or deceptive information, which can lead users to form inaccurate beliefs and undermine their autonomy. Humans who make decisions based on false beliefs can experience physical, emotional, or material harms.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed
No population impact data reported.