Tesla's Autopilot system misidentified a horse-drawn carriage, displaying it on the dashboard screen variously as a truck, a car, and a human.
A Tesla Model Y driver in Switzerland captured video of the vehicle's Autopilot system failing to correctly identify a horse-drawn carriage on a highway outside Zurich. The Tesla's visualization system displayed multiple incorrect identifications, rendering the carriage variously as a motorcycle, a car, a truck, and a human figure. At one point, the display showed the 'truck' spinning around as if on course for a head-on collision with the Tesla. The video, posted by Swiss YouTuber RealRusty, gained over 600,000 views on Instagram. Tesla's Autopilot system is designed to detect vehicles, pedestrians, and other objects in the car's vicinity and to take evasive action if needed. The incident highlights the difficulty AI perception systems have with out-of-distribution inputs, i.e., objects that were likely absent from their training data. No collision or harm occurred during the incident.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing users to errors and failures that can have significant consequences, especially in critical applications or domains requiring moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed