Facial recognition systems used by UK retailers and police incorrectly identified innocent individuals as criminals, leading to false accusations, wrongful detentions, and public humiliation.
The incident involves facial recognition technology deployed by UK retailers and police that falsely identified several innocent individuals. Sara was wrongly accused of theft by a Facewatch facial recognition system at a Home Bargains store; her bag was searched and she was banned from all stores using the technology before Facewatch acknowledged the error. Shaun Thompson was incorrectly identified by Metropolitan Police facial recognition cameras near London Bridge and detained for 20 minutes before proving his identity with a passport. The Metropolitan Police report that around one in every 33,000 people who walk past their cameras is misidentified, and that one in 40 alerts is a false positive. The police used the technology 67 times in 2024 alone, compared with 23 times in 2023 and nine times between 2020 and 2022. Facewatch is used across numerous UK retailers, including Budgens, Sports Direct, and Costcutter.
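As a rough consistency check (a derivation, not a figure from the source), the two quoted Metropolitan Police rates together imply an overall alert rate; a minimal Python sketch:

```python
# Sketch: consistency check on the two Met Police figures quoted above.
# Reported: ~1 in 33,000 passers-by is misidentified (a false positive),
# and ~1 in 40 alerts is a false positive.

false_positives_per_passerby = 1 / 33_000  # misidentification rate among all passers-by
false_fraction_of_alerts = 1 / 40          # share of alerts that are false positives

# Implied overall alert rate (alerts per passer-by):
# false positives per passer-by divided by the false share of alerts.
alerts_per_passerby = false_positives_per_passerby / false_fraction_of_alerts
print(f"Implied alert rate: ~1 in {round(1 / alerts_per_passerby)} passers-by")
# → Implied alert rate: ~1 in 825 passers-by
```

That is, the two figures are mutually consistent with the cameras raising roughly one alert per 825 passers-by, of which about one in 40 is wrong.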
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
The accuracy and effectiveness of an AI system's decisions and actions depend on group membership: design decisions and biased training data lead to unequal outcomes, reduced benefits, increased effort, and alienation for some groups of users.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed