Brian Hofer and his brother were detained at gunpoint by police after an automated license plate reader system incorrectly flagged their rental car as stolen due to outdated database information.
On November 25, Brian Hofer and his brother were driving home from Thanksgiving when Contra Costa County sheriff's deputies pulled them over near Hercules, California. A fixed camera operated by Vigilant Solutions had scanned their license plate and matched it against a 'hot list' registry of stolen vehicles. The Getaround rental car had in fact been stolen from San Jose in October, but after the vehicle was recovered the database was never updated to clear the stolen-vehicle record. During the 40-minute detention, both men were handcuffed and held at gunpoint while police verified their identities through the Getaround app.

The Northern California Regional Intelligence Center estimates that automated license plate reader systems have a 10 percent error rate. Hofer, who chairs Oakland's Privacy Advisory Commission, subsequently filed a federal lawsuit against the sheriff's department alleging civil rights violations, warrantless search, and excessive force. The incident highlights broader concerns about automated policing, where outdated or incorrect data can lead to dangerous confrontations.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing users to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed