South Wales Police deployed facial recognition technology that produced a false positive rate of over 90% at the 2017 Champions League final, wrongly flagging 2,297 of the 2,470 people it identified as potential criminals.
South Wales Police deployed Automated Facial Recognition (AFR) technology based on NEC's NeoFace Watch system starting in June 2017, ahead of the Champions League final in Cardiff. The system compared faces captured by CCTV cameras against a database of approximately 500,000 custody images to identify persons of interest. At the final, attended by roughly 170,000 people, the system generated 2,470 alerts, of which 2,297 were false positives and only 173 were true matches: an error rate of approximately 93% (widely reported as 92%).

Police attributed the high false positive rate to poor-quality images supplied by partner agencies such as UEFA and Interpol, and to this being the system's first major deployment. Despite the errors, police stated that no one was arrested on the basis of a false positive alone, because human officers verified matches through traditional policing methods. The technology was subsequently used at a range of events, including rugby matches, concerts, and royal visits.

In August 2020, the Court of Appeal ruled that the use of AFR was unlawful, finding that it violated Article 8 of the European Convention on Human Rights because of an inadequate legal framework, an insufficient data protection impact assessment, and a failure to comply with the public sector equality duty with respect to potential racial and gender bias.
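A note on the headline figure: the reported rate is the share of alerts that turned out to be wrong (a false discovery rate), not the fraction of the 170,000 attendees who were misidentified. The sketch below shows how a generic watchlist system of this kind produces alerts and how the statistic is computed; it assumes a common embedding-plus-threshold design, since NeoFace Watch's actual matching algorithm is proprietary, and the `cosine_similarity` helper and threshold value are illustrative.

```python
# Illustrative sketch only: NEC's NeoFace Watch matching is proprietary.
# Assumes a generic face-embedding comparison with a fixed alert threshold.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def raises_alert(probe: np.ndarray, watchlist: list[np.ndarray],
                 threshold: float = 0.6) -> bool:
    """Alert if any watchlist embedding is similar enough to the probe face.

    Lowering the threshold catches more persons of interest but inflates
    false alerts; that trade-off underlies the figures reported here.
    """
    return any(cosine_similarity(probe, ref) >= threshold for ref in watchlist)

# Alert-level statistics reported for the 2017 Champions League final.
# Note this is the share of *alerts* that were wrong, not the share of
# all 170,000 scanned attendees who were misidentified.
alerts = 2470
false_positives = 2297
true_positives = alerts - false_positives   # 173
error_rate = false_positives / alerts       # ~0.93
print(f"{true_positives} true matches; {error_rate:.0%} of alerts were false")
```

Because every alert was reviewed by a human officer before any action was taken, the practical consequence of a high false discovery rate is wasted review effort and the risk of wrongful stops rather than automatic arrests.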
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing them to errors and failures that can have significant consequences, especially in critical applications or areas requiring moral reasoning.
AI system: due to a decision or action made by an AI system
Unintentional: due to an unexpected outcome from pursuing a goal
Post-deployment: occurring after the AI model has been trained and deployed