Honda's Collision Mitigation Braking System (CMBS) falsely triggered automatic emergency braking in multiple vehicles, causing unexpected deceleration from 45 mph to 20 mph without driver input, resulting in whiplash injuries and prompting class-action lawsuits.
Honda's Collision Mitigation Braking System (CMBS) uses millimeter-wave radar and cameras to scan 300 feet ahead for collision risks and automatically applies the brakes when an obstacle is detected. Multiple drivers reported that the system triggered falsely, braking suddenly from 45 mph to 20 mph without warning when no obstacle was present; one family member suffered whiplash in such an incident.

A consolidated class-action lawsuit in California federal court alleges that Honda violated unfair-business-practices laws by failing to disclose known defects. Plaintiffs report that the system regularly malfunctions in response to opposing traffic, slamming on the brakes at intersections and on straightaways with no vehicle in their lane.

The CMBS software's braking decisions encode its programmers' choices about when collision avoidance should override the driver. Government pressure from NHTSA, which has cited that 94% of crashes involve human error, has pushed automakers to deploy such AI systems despite their imperfections. The legal system meanwhile creates conflicting incentives: manufacturers face liability both for installing collision-avoidance technology and for omitting it.
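The failure mode alleged in the lawsuits can be illustrated with a minimal sketch of how forward-collision braking systems generally decide to intervene: estimate time-to-collision (TTC) from radar range and closing speed, filter out objects outside the vehicle's lane, and brake when TTC drops below a threshold. Everything here is an assumption for illustration — the threshold, lane width, and field names are hypothetical, and this is not Honda's actual CMBS algorithm; only the 300-foot detection range comes from the report above.

```python
# Hypothetical sketch of a TTC-based automatic-braking decision.
# Not Honda's CMBS logic: thresholds and lane geometry are assumed values.

from dataclasses import dataclass

@dataclass
class RadarTrack:
    range_ft: float           # distance to detected object, feet
    closing_speed_fps: float  # positive when the gap is shrinking, ft/s
    lateral_offset_ft: float  # offset from own-lane center, feet

DETECTION_RANGE_FT = 300.0    # the system reportedly scans 300 ft ahead
TTC_BRAKE_THRESHOLD_S = 2.0   # assumed threshold; real values are proprietary
LANE_HALF_WIDTH_FT = 6.0      # assumed: objects beyond this are out of lane

def should_brake(track: RadarTrack) -> bool:
    """Return True if the detected object warrants automatic braking."""
    if track.range_ft > DETECTION_RANGE_FT:
        return False
    # Opposing traffic in an adjacent lane should be rejected here; the
    # plaintiffs allege this kind of filtering failed in practice.
    if abs(track.lateral_offset_ft) > LANE_HALF_WIDTH_FT:
        return False
    if track.closing_speed_fps <= 0:
        return False  # object is not getting closer
    ttc_s = track.range_ft / track.closing_speed_fps
    return ttc_s < TTC_BRAKE_THRESHOLD_S

# Stopped car 100 ft ahead, closing at 66 ft/s (~45 mph): TTC ≈ 1.5 s
print(should_brake(RadarTrack(100.0, 66.0, 0.0)))    # True: brake
# Oncoming car offset 10 ft into the opposing lane: filtered out
print(should_brake(RadarTrack(100.0, 120.0, 10.0)))  # False: no brake
```

The sketch shows why lane filtering is the safety-critical step: a fast-closing oncoming vehicle produces a very short TTC, so if the lateral-offset check misclassifies it as in-lane, the system brakes hard with nothing in the driver's path — the behavior the plaintiffs describe.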
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing users to errors and failures that can have significant consequences, especially in critical applications or domains that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed