Waymo's self-driving vehicles in Texas have illegally passed school buses with stop arms deployed 19 times since the start of the school year, including incidents where students were still in the road.
The National Highway Traffic Safety Administration (NHTSA) opened a probe in October after a Waymo self-driving car in Georgia drove past a stopped school bus despite its flashing red lights and deployed stop arm. The Austin Independent School District reported that Waymo vehicles had illegally passed school buses 19 times since the start of the school year. Five additional incidents occurred in November, after Waymo said it had implemented software updates to resolve the issue. In one particularly concerning incident, a Waymo vehicle drove past a stopped school bus moments after a student had crossed in front of it, while the student was still in the road.

The school district asked Waymo to halt operations around schools during pick-up and drop-off times until the vehicles could reliably comply with traffic laws. Waymo refused, and another incident occurred on December 1, indicating that the software updates had not resolved the safety concerns. NHTSA has demanded that Waymo respond by January 20 with answers about the incidents and details of the software updates intended to address the safety issues.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing them to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed