A Tesla Model Y in Full Self-Driving beta mode crashed on November 3rd in Brea, California, when the AI system forced the vehicle into the wrong lane during a left turn despite the driver's attempt to correct it. The maneuver resulted in a collision with another vehicle and severe damage to the Tesla.
On November 3rd, a Tesla Model Y operating in Full Self-Driving (FSD) beta mode crashed in Brea, a city southeast of Los Angeles. The incident was reported to the National Highway Traffic Safety Administration (NHTSA) by the vehicle's owner. According to the report, the FSD system guided the car into the wrong lane while making a left turn. The driver received an alert halfway through the turn and attempted to correct the steering, but the car took control and forced itself into the incorrect lane, creating an unsafe maneuver. This resulted in a collision with another vehicle in the adjacent lane. No one was injured in the crash, but the Tesla Model Y was severely damaged on the driver's side. Tesla's FSD is classified as a Level 2 automated driving system under the Society of Automotive Engineers' taxonomy, meaning drivers must remain vigilant with eyes on the road and hands on the steering wheel. NHTSA confirmed it is investigating the claim, and this incident is likely the first reported crash involving Tesla's controversial FSD beta feature.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions are exposed to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed