ShotSpotter gunshot detection systems deployed in multiple US cities generated tens of thousands of false alerts that sent police to respond to non-existent threats, disproportionately subjecting Black and Latinx communities to unnecessary armed police deployments.
ShotSpotter is an AI-powered gunshot detection system that uses networks of microphones and audio analysis software to identify potential gunshots and alert police. Studies examined ShotSpotter deployments across multiple cities, including Chicago, Kansas City, Cleveland, Atlanta, and Syracuse, from 2019 to 2022. In Chicago alone, an analysis of 21 months of data showed that 89% of more than 40,000 ShotSpotter alerts turned up no gun-related crime and 86% led to no report of any crime at all.

The system, which costs between $65,000 and $95,000 per square mile annually, was deployed almost exclusively in majority Black and Latinx neighborhoods across all studied cities. Chicago spent $33 million on a three-year contract, while Syracuse spent nearly $400,000 annually.

The false alerts created dangerous situations by sending armed police expecting active gunfire into communities where residents posed no threat. In one case, police responding to a ShotSpotter alert fatally shot 13-year-old Adam Toledo in Chicago. The system frequently misidentified sounds such as fireworks, car backfires, and construction noise as gunshots, and human reviewers at ShotSpotter had only 60 seconds to verify each alert before it was sent to police.
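To make the reported flow concrete (sensors flag impulsive sounds, a classifier scores them, and a human reviewer has a 60-second window before police are dispatched), here is a minimal Python sketch. ShotSpotter's actual models, thresholds, and review tooling are proprietary and not described here; every class name, score, and threshold below is a hypothetical assumption, and the final lines simply restate the Chicago percentages quoted above.

```python
from dataclasses import dataclass

# Hypothetical sketch of the detect-review-dispatch flow described above.
# All names, scores, and thresholds are assumptions for illustration only.

REVIEW_WINDOW_SECONDS = 60  # reviewers had 60 seconds per the incident report


@dataclass
class AcousticEvent:
    sensor_id: str
    label: str           # ground truth, unknown to the system (demo only)
    gunshot_score: float # assumed classifier confidence the sound is gunfire


def machine_flag(event: AcousticEvent, threshold: float = 0.5) -> bool:
    """Stage 1: the audio classifier flags impulsive sounds as possible
    gunfire. Fireworks, car backfires, and construction noise can all score
    above threshold, which is the source of the false alerts described."""
    return event.gunshot_score >= threshold


def human_review(event: AcousticEvent) -> bool:
    """Stage 2: a reviewer confirms or rejects the flag under time pressure
    (the 60-second window). Modeled here as a simple confidence cutoff."""
    return event.gunshot_score >= 0.6  # assumed reviewer heuristic


events = [
    AcousticEvent("sensor-12", "gunshot", 0.93),
    AcousticEvent("sensor-07", "fireworks", 0.78),     # false positive
    AcousticEvent("sensor-31", "car backfire", 0.64),  # false positive
]

dispatched = [e for e in events if machine_flag(e) and human_review(e)]
false_alerts = [e for e in dispatched if e.label != "gunshot"]
print(f"dispatched {len(dispatched)} alerts, {len(false_alerts)} false")

# Scale of the problem, from the Chicago figures quoted above:
total_alerts = 40_000
print(f"~{int(total_alerts * 0.89):,} alerts with no gun-related crime found")
print(f"~{int(total_alerts * 0.86):,} alerts with no crime report at all")
```

Even with a human in the loop, a 60-second review window leaves little room to distinguish a firework from a gunshot, which is consistent with the false-alert rates reported above.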
Domain classification, causal taxonomy, severity scores, and national security assessments were assigned by an LLM classifier and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and unfair representation of those groups.
AI system: Due to a decision or action made by an AI system
Unintentional: Due to an unexpected outcome from pursuing a goal
Post-deployment: Occurring after the AI model has been trained and deployed