Mohammed Khadeer, a 35-year-old innocent man from Medak, died after alleged custodial torture by Telangana Police, who arrested him because facial recognition technology incorrectly matched his features to those of a chain-snatching suspect.
On February 2, Mohammed Khadeer, a 35-year-old resident of Medak, was arrested by Telangana Police in connection with a chain-snatching case because his facial features matched those of a suspect identified in CCTV footage, likely through the facial recognition system deployed across Telangana. According to Superintendent of Police Rohini Priyadarshini, he was brought to Medak town police station for questioning, his call data records were verified, and he was released on February 3 after confirmation of his non-involvement in the case.

However, during his detention, Khadeer was allegedly subjected to severe custodial torture by police officers. Following his release, he was treated at Gandhi Hospital in Hyderabad, where he died on February 18. The Telangana Police later confirmed that Khadeer was innocent.

The incident highlights issues with the facial recognition system used by Telangana Police, which stores photos and biometrics of ex-criminals in the Crime and Criminal Tracking Networks and Systems (CCTNS). The report notes a lack of transparency regarding how the facial recognition system functions and how policing practices are deployed in Telangana, with no publicly available police manual detailing these procedures since the formation of Telangana in 2014.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and unfair representation of those groups.
- AI system: Due to a decision or action made by an AI system
- Unintentional: Due to an unexpected outcome from pursuing a goal
- Post-deployment: Occurring after the AI model has been trained and deployed