A Tesla operating on Full Self-Driving Beta 11.4.1 failed to yield to a pedestrian in a marked crosswalk in San Francisco, proceeding through the intersection even though the system had detected the pedestrian and state law required it to stop.
A seven-second video posted on Twitter by Whole Mars Catalog showed a Tesla running Full Self-Driving Beta version 11.4.1 approaching a marked crosswalk in San Francisco at 28 mph. The Tesla's display showed that the system had detected a pedestrian who had already begun crossing from the left when the car was roughly 50 yards away. Although California law requires vehicles to yield to pedestrians in marked crosswalks, and despite posted signage reading 'STATE LAW YIELD TO [PEDESTRIAN] WITHIN CROSSWALK,' the Tesla hesitated slightly but continued through the intersection without stopping, and the pedestrian had to keep crossing as the car passed by.

The poster celebrated the incident as 'bullish' and 'exciting,' claiming the car proceeded 'like a human would knowing there was enough time to do so.' The maneuver was nonetheless a clear violation of traffic law, since pedestrians have the right-of-way in marked crosswalks. The video was viewed 1.7 million times and sparked controversy over whether autonomous vehicles should be programmed to break traffic laws. Tesla's FSD Beta has already been the subject of NHTSA recalls and multiple federal investigations over safety concerns.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Risk domain: AI systems that fail to perform reliably or effectively under varying conditions, exposing them to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
Entity: AI system (due to a decision or action made by an AI system)
Intent: Unintentional (due to an unexpected outcome from pursuing a goal)
Timing: Post-deployment (occurring after the AI model has been trained and deployed)