A Tesla Model 3's Autopilot system, traveling at 80 MPH behind a truck hauling deactivated traffic lights, mistook the cargo for active signals and continuously displayed an endless trail of traffic lights on the car's screen.
A Tesla Model 3 owner encountered a glitch in the Autopilot assisted-driving system while traveling on the highway at upwards of 80 MPH. The system detected what appeared to be an endless trail of traffic lights extending down the road, rendered on the car's screen as if streaming out of the truck ahead. The driver uploaded video footage to Reddit showing the unusual display. After speculation from other users, a follow-up video revealed that the Tesla had been driving behind a truck hauling deactivated traffic lights as cargo. The system correctly recognized the objects as traffic lights but failed to grasp from context that they were cargo rather than installed signals. Fortunately, the car did not attempt to brake or stop in response to the detected lights; had Autopilot treated them as active red signals, sudden braking at highway speed could have been dangerous. The incident highlights the challenges autonomous driving systems face with edge cases and contextual understanding in real-world scenarios.
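The contextual failure here is essentially a filtering problem: the perception stack detects traffic lights accurately but has no rule for deciding whether a light is installed infrastructure or an object being carried by another vehicle. A minimal sketch of one such heuristic appears below, using hypothetical bounding-box structures (`Box`, `contains`, `filter_carried_signals` are all illustrative names, and nothing here reflects Tesla's actual pipeline): suppress any traffic-light detection whose box falls inside a detected vehicle's box.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box in image coordinates (hypothetical format)."""
    x1: float
    y1: float
    x2: float
    y2: float

def contains(outer: Box, inner: Box, margin: float = 0.0) -> bool:
    """True if `inner` lies entirely within `outer`, allowing a small margin."""
    return (inner.x1 >= outer.x1 - margin and inner.y1 >= outer.y1 - margin
            and inner.x2 <= outer.x2 + margin and inner.y2 <= outer.y2 + margin)

def filter_carried_signals(traffic_lights: list[Box], vehicles: list[Box]) -> list[Box]:
    """Drop traffic-light detections that sit inside a vehicle's box,
    on the heuristic that they are probably cargo, not installed signals."""
    return [tl for tl in traffic_lights
            if not any(contains(v, tl) for v in vehicles)]

# Hypothetical frame: two "lights" inside the truck's box, one roadside signal.
truck = Box(300, 200, 700, 500)
lights = [Box(400, 250, 430, 320), Box(500, 260, 530, 330), Box(50, 40, 80, 110)]
print(filter_carried_signals(lights, [truck]))  # keeps only the roadside signal
```

Even this toy rule shows why context is hard: a real signal on a gantry that happens to appear behind the truck from the camera's viewpoint would be wrongly suppressed, so production systems need richer cues (depth, motion relative to the road, map data) rather than 2D overlap alone.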
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Risk domain: AI systems that fail to perform reliably or effectively under varying conditions, exposing them to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
Entity: AI system (due to a decision or action made by an AI system)
Intent: Unintentional (due to an unexpected outcome from pursuing a goal)
Timing: Post-deployment (occurring after the AI model has been trained and deployed)