Tesla's Full Self-Driving technology repeatedly mistook the moon for a yellow traffic light, causing the vehicle to slow down unexpectedly on highways.
Tesla's Full Self-Driving (FSD) technology has mistaken celestial objects and various signs for traffic signals. A viral tweet showed a Tesla confusing a low-hanging yellow moon for a traffic light, causing the vehicle to repeatedly slow down while cruising on the highway. Similar incidents have been reported in which Tesla vehicles mistook the sun for a red light, billboards depicting stop signs for actual stop signs, and Burger King signs for stop signs. The Traffic Light and Stop Sign Control feature was released via software update in early 2020 and remains in beta, requiring manual activation by owners. Tesla resolved the Burger King issue through a subsequent software update. CNN conducted extensive testing of the FSD system in Brooklyn, documenting multiple dangerous incidents, including nearly crashing into construction sites, attempting to turn into stopped trucks, and trying to drive on the wrong side of the road; the testing required frequent human interventions to prevent crashes and other dangerous situations. The $10,000 FSD feature is also available as a $99-per-month subscription but does not make vehicles truly autonomous.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing users to errors and failures that can have significant consequences, especially in critical applications or domains that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed