Lockport School District's facial recognition system misidentified Black students at higher rates and mistakenly flagged objects like broom handles as weapons, raising concerns about biased policing responses.
The Lockport City School District in upstate New York deployed the AEGIS facial recognition and weapons detection system by SN Technologies in January 2020 after years of controversy. Documents obtained by Motherboard revealed that SN Technologies misled the district about the system's accuracy and racial bias rates. The company claimed its algorithm ranked 49th out of 139 in NIST's racial bias tests, but NIST scientist Patrick Grother confirmed these claims were false. An independent audit by Freed Maxick found that the system actually misidentified Black men four times more often, and Black women 16 times more often, than white men, worse than the company's claimed rates of 2x and 10x respectively. The system also frequently generated false weapon alerts, including mistaking broom handles for guns. Lockport's student body is 11% Black, and parents worried that a biased facial recognition match could trigger an armed police response. The system was designed to alert police when it detected weapons or flagged individuals, though SN Technologies claims human verification is required before any alert is acted on. The COVID-19 pandemic rendered the facial recognition component largely useless due to mask requirements.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
The accuracy and effectiveness of AI decisions and actions depend on group membership: AI system design choices and biased training data lead to unequal outcomes, reduced benefits, increased effort, and alienation for affected users.
Entity: AI system (due to a decision or action made by an AI system)
Intent: Unintentional (due to an unexpected outcome from pursuing a goal)
Timing: Post-deployment (occurring after the AI model has been trained and deployed)