A 61-year-old man was misidentified by facial recognition technology as an armed robber, leading to his wrongful arrest and to a sexual assault in jail before the charges were dropped.
On January 22, 2022, two armed men robbed a Sunglass Hut in Houston, Texas. EssilorLuxottica loss prevention agent Anthony Pfleger worked with Macy's to run surveillance footage through Macy's facial recognition system, which incorrectly identified 61-year-old Harvey Murphy Jr. as one of the robbers, despite Murphy being in a Sacramento, California jail at the time of the robbery, nearly 2,000 miles away. Based on this AI identification, Houston police conducted a photo lineup in which a sales associate also identified Murphy as the robber. Murphy was arrested in October 2022 when he went to renew his driver's license at a DMV office. He was held in the Harris County jail for approximately 10 days, during which he was beaten and gang-raped by three inmates in a jail bathroom. Hours after the assault, prosecutors dropped all charges against him after confirming his alibi. Murphy has filed a $10 million lawsuit against Macy's and EssilorLuxottica, seeking damages for what he describes as "lifelong injuries" from the assault.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, leading to errors and failures that can have significant consequences, especially in critical applications or in domains that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed