A senior Australian lawyer filed court submissions in a murder case that included fake quotes and nonexistent case judgments generated by artificial intelligence, delaying the case's resolution by 24 hours.
In the Supreme Court of Victoria, Australia, defense lawyer Rishi Nathwani, who holds the title of King's Counsel, filed submissions in a murder case involving a teenager that contained AI-generated fabricated content. The submissions included fabricated quotes from a speech to the state legislature and nonexistent case citations purportedly from the Supreme Court. The errors were discovered by Justice James Elliott's associates, who could not find the cited cases and requested copies from the defense lawyers. The lawyers then admitted that the citations 'do not exist' and that the submission contained 'fictitious quotes.' The AI-generated errors caused a 24-hour delay: Elliott had hoped to conclude the case on Wednesday, but resolution was pushed to Thursday. The lawyers explained that they had checked the initial citations for accuracy but wrongly assumed the others would also be correct. Justice Elliott noted that the Supreme Court had released guidelines last year requiring that AI use be 'independently and thoroughly verified.' The court documents do not identify which generative AI system the lawyers used.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that inadvertently generate or spread incorrect or deceptive information, which can lead to inaccurate beliefs in users and undermine their autonomy. Humans who make decisions based on false beliefs can experience physical, emotional, or material harms.
Human
Due to a decision or action made by humans
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed