A Tesla Model Y, allegedly operating in Autopilot mode, struck a 17-year-old student exiting a school bus in North Carolina at 45 mph, causing severe injuries including a fractured neck and a broken leg.
In March 2023, a Tesla Model Y allegedly operating in Autopilot mode struck 17-year-old Tillman Mitchell as he exited a school bus on North Carolina Highway 561 in Halifax County. The school bus was displaying its stop sign and flashing red warning lights when Mitchell stepped off and was hit at 45 mph. The impact threw Mitchell into the windshield and then into the air before he landed facedown on the road. Mitchell suffered life-threatening injuries, including a fractured neck and a broken leg, and required placement on a ventilator. He continues to experience memory problems and difficulty walking. The driver, Dr. Howard Gene Yee, was charged with reckless driving and with passing a stopped school bus and striking a person. Authorities discovered that Yee had attached weights to the steering wheel to circumvent Autopilot's safety feature requiring driver attention. This incident is part of a broader pattern identified by a Washington Post analysis, which found 736 crashes involving Tesla Autopilot since 2019, including 17 fatal incidents. NHTSA is investigating the crash as part of its ongoing examination of Tesla's driver-assistance technology.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, leading to errors and failures that can have significant consequences, especially in critical applications or domains that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed