A truck driver was wrongfully arrested and detained for 11 hours after a casino's AI facial recognition system misidentified him, with reported 100% confidence, as a banned individual, even though he presented valid identification documents.
In September 2023, truck driver Jason Killinger stopped at the Peppermill Casino in Reno and was misidentified by the venue's AI facial recognition technology as a man who had been banned from the venue months earlier for sleeping on the premises. The AI system flagged Killinger as a '100% match' to the banned individual, identified only as 'M.E.' in court documents. Casino security detained Killinger, and rookie police officer R. Jager arrested him even though Killinger provided a valid Nevada driver's license, a UPS pay stub, and a vehicle registration, all bearing his name and matching his physical descriptors. Officer Jager refused to believe Killinger's identity documentation and accused him of carrying fraudulent IDs. Killinger was detained for 11 hours in total, 4 of them spent handcuffed, leaving him with bruises and shoulder pain. He was ultimately freed after a fingerprint check confirmed his true identity. Killinger, who had already settled with the casino for an undisclosed amount, filed a wrongful arrest lawsuit and is now pursuing legal action against the officer, alleging fabrication of evidence and malicious prosecution.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes for, and unfair representation of, those groups.
AI system: due to a decision or action made by an AI system
Unintentional: due to an unexpected outcome from pursuing a goal
Post-deployment: occurring after the AI model has been trained and deployed