A Tennessee grandmother spent nearly six months in jail after facial recognition software wrongly identified her as the suspect in North Dakota bank fraud cases to which she had no connection.
Angela Lipps, a 50-year-old Tennessee grandmother, was arrested in July 2025 after Fargo police used facial recognition software to identify her as a suspect in bank fraud cases involving a woman who used a fake military ID to withdraw tens of thousands of dollars. The AI system, operated by West Fargo police using Clearview AI technology, flagged Lipps as a potential match based on surveillance footage. Fargo police then reviewed her social media and driver's license photo and concluded she matched the suspect based on facial features, body type, and hairstyle.

U.S. Marshals arrested Lipps at gunpoint while she was babysitting four children. She spent nearly four months in a Tennessee jail without bail as a fugitive before being extradited to North Dakota in October 2025. Her attorney obtained bank records showing she was over 1,200 miles away in Tennessee during the alleged crimes, making purchases and deposits that established her location. After her first police interview in December, the charges were dismissed on Christmas Eve 2025.

While incarcerated, Lipps lost her home, car, and dog, and after her release she was left stranded in Fargo without assistance to return home. Police acknowledged errors in their process and have since prohibited use of the West Fargo AI system.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, leading to errors and failures that can have significant consequences, especially in critical applications or domains that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed