A South African law firm used AI-generated fictitious case citations in court proceedings, with 7 out of 9 cited cases being non-existent, leading to court sanctions and referral to the Legal Practice Council.
In September 2024, the Pietermaritzburg High Court discovered that the law firm Surendra Singh and Associates had submitted fictitious case citations in a leave to appeal application for politician Philani Godfrey Mavundla. Judge Elsje-Marie Bezuidenhout found that of nine cases cited in the supplementary notice of appeal, only two existed, and one of those carried an incorrect citation. The candidate attorney, Ms Rasina Farouk, initially denied using AI when questioned, but it later became apparent that AI applications such as ChatGPT had been used for legal research without proper verification. The judge conducted her own ChatGPT search, which incorrectly confirmed the details of a non-existent case, demonstrating AI hallucination. The court imposed costs on the law firm and referred the matter to the Legal Practice Council for disciplinary action. A subsequent similar case, Northbound Processing v SA Diamond and Precious Metals Regulator, reinforced that good intentions and apologies do not excuse the fundamental breach of presenting fictitious citations, with courts establishing mandatory referral to professional bodies regardless of circumstances.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that inadvertently generate or spread incorrect or deceptive information, which can lead to inaccurate beliefs in users and undermine their autonomy. Humans who make decisions based on false beliefs can experience physical, emotional, or material harms.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed