A Waymo autonomous vehicle struck a child who ran across the street from behind a double-parked SUV near a Santa Monica elementary school during drop-off hours on January 23, 2025, causing minor injuries.
On January 23, 2025, a Waymo autonomous vehicle struck a child near an elementary school in Santa Monica, California, during normal school drop-off hours. The incident occurred within two blocks of the school while other children, a crossing guard, and several double-parked vehicles were nearby. The child ran across the street from behind a double-parked SUV toward the school and was struck by the Waymo vehicle, which was operating on Waymo's 5th Generation Automated Driving System with no human safety supervisor present. The child sustained minor injuries. According to Waymo, its technology detected the child immediately upon their emergence from behind the stopped vehicle and braked hard, reducing speed from approximately 17 mph to under 6 mph before contact. The company stated that a fully attentive human driver would likely have made contact at 14 mph. After the collision, the child stood up immediately and walked to the sidewalk, and Waymo called 911. The National Highway Traffic Safety Administration has opened a preliminary evaluation of the incident, focusing on whether the Waymo vehicle exercised appropriate caution given its proximity to the elementary school and the presence of young pedestrians.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing those who depend on them to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed