Facial recognition systems used by UK retailers and police wrongly identified innocent people as criminals, leading to false accusations, detentions, and emotional distress for those affected.
Multiple facial recognition systems caused misidentification incidents in the UK. The Facewatch system, used by retailers including Home Bargains, Budgens, Sports Direct, and Costcutter, incorrectly flagged an innocent woman as a shoplifter: she was accused, searched, escorted from the store, and banned from all stores using the technology. Facewatch later acknowledged the error in writing.

The Metropolitan Police's live facial recognition system, deployed in modified vans with roof-mounted cameras, incorrectly identified Shaun Thompson near London Bridge as a wanted suspect. Thompson was detained for 20 minutes, asked to provide fingerprints, and released only after producing his passport; the mistake may have been due to a family resemblance. The Met reports deploying the technology 67 times in 2024, with one in 40 alerts being a false positive, although the force maintains that only one in 33,000 people scanned is misidentified. The two figures are not contradictory: they use different denominators, alerts generated versus everyone scanned. The system has resulted in 192 arrests this year, including individuals wanted for sexual offenses, assault, and other crimes.
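As a rough sense of how both reported rates can hold at once (an illustrative back-of-the-envelope check, not a calculation from the source), the two figures jointly imply how often a scan produces an alert at all:

```python
# Back-of-the-envelope reconciliation of the Met's two reported rates.
# Assumption (not stated in the source): "misidentified" means a false
# alert, so false_alerts / people_scanned = 1 / 33,000.

false_alerts_per_alert = 1 / 40      # one in 40 alerts is a false positive
false_alerts_per_scan = 1 / 33_000   # one in 33,000 people scanned is misidentified

# alerts/scans = (false_alerts/scans) / (false_alerts/alerts)
implied_alert_rate = false_alerts_per_scan / false_alerts_per_alert
print(f"Implied alert rate: 1 in {1 / implied_alert_rate:,.0f} people scanned")
# -> Implied alert rate: 1 in 825 people scanned
```

On these assumptions, roughly one in 825 people scanned triggers an alert, of which one in 40 is wrong, which is how a low per-scan error rate can coexist with a much higher per-alert error rate.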
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
The accuracy and effectiveness of AI decisions and actions depend on group membership: choices in AI system design and biased training data produce unequal outcomes, reduced benefits, increased effort, and alienation of users.
Entity: AI system (due to a decision or action made by an AI system)
Intent: Unintentional (due to an unexpected outcome from pursuing a goal)
Timing: Post-deployment (occurring after the AI model has been trained and deployed)