A facial recognition AI system used by Bahia police incorrectly identified an innocent Black administrative assistant named Davi as a wanted criminal, leading to him being tracked across 15 metro stations and subsequently detained by police.
In Salvador, Bahia, Brazil, police deployed a facial recognition AI system supplied by the Spanish company Iecisa in partnership with Huawei under an R$ 18 million contract. The system compares faces captured by cameras against a database of wanted persons maintained by the Public Security Secretariat; when the algorithm finds a match of at least 90% similarity, it triggers an alert for human analysis and potential police action.

In this incident, the system incorrectly matched Davi, a Black administrative assistant, to a wanted person as he passed through Lapa metro station. The system tracked him for 15 stations over 22 kilometers, from Lapa to Mussurunga, before police approached him at a bus stop. After checking his ID, officers confirmed he was not the person the system had flagged and released him.

The system's accuracy is very low: only 3.6% of the 903 alerts raised at the 2019 Feira de Santana Micareta festival resulted in actual arrests. Despite this, the Bahia government plans to expand the system to 77 cities with 4,095 additional cameras at a cost of R$ 665 million.

Research shows that facial recognition systems disproportionately misidentify Black people, a finding that is particularly concerning in Bahia, where 97% of police violence victims are Black and which has the highest percentage of Black residents of any Brazilian state.
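The alert logic described above can be sketched in a few lines. This is a minimal illustration, not the deployed system's actual code: the article does not specify the matching method, so cosine similarity on face embeddings is assumed here, and all names (`should_alert`, `SIMILARITY_THRESHOLD`) are illustrative. It also works out the arrest count implied by the reported Micareta figures.

```python
import numpy as np

# Per the report, an alert fires at 90% similarity and is then
# routed to a human analyst before any police action.
SIMILARITY_THRESHOLD = 0.90

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def should_alert(probe, watchlist):
    """Return indices of watchlist embeddings whose similarity to the
    probe meets the threshold; each hit would be queued for review."""
    return [i for i, ref in enumerate(watchlist)
            if cosine_similarity(probe, ref) >= SIMILARITY_THRESHOLD]

# Precision implied by the reported figures: 3.6% of 903 alerts
# resulted in arrests, i.e. roughly 33 alerts out of 903.
print(round(903 * 0.036))  # -> 33
```

A fixed similarity threshold like this trades false negatives against false positives: the 3.6% figure suggests the vast majority of alerts at that threshold were not the wanted person, which is exactly the failure mode Davi's case illustrates.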
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes or unfair representation of those groups.
- AI system: due to a decision or action made by an AI system
- Unintentional: due to an unexpected outcome from pursuing a goal
- Post-deployment: occurring after the AI model has been trained and deployed