Tesla's Autopilot system was active when a Model 3 crashed into a tractor-trailer truck in Florida, killing driver Jeremy Beren Banner. It was at least the fourth fatal crash involving Tesla's advanced driver assistance system.
On March 1st, a Tesla Model 3 equipped with Autopilot crashed into the side of a tractor-trailer truck in Florida, killing the 50-year-old driver, Jeremy Beren Banner. According to the National Transportation Safety Board (NTSB), neither the driver nor the Autopilot system executed evasive maneuvers before the collision. The driver had engaged Autopilot approximately 10 seconds before the crash, and for the final 8 seconds before impact the vehicle did not detect the driver's hands on the steering wheel.

The Model 3 was traveling at 68 mph when it struck the truck, shearing off its roof as it passed underneath the trailer before coming to rest 1,600 feet away. The incident marks at least the fourth fatal crash involving Tesla's Autopilot system, and it bears similarities to a 2016 incident near Gainesville, Florida, in which Joshua Brown was killed in a comparable collision with a semitrailer truck.

Tesla confirmed the sequence of events and noted that drivers have logged over one billion miles with Autopilot engaged, stating that, when the system is used properly by attentive drivers, those supported by Autopilot are statistically safer than those driving without assistance.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, leading to errors and failures that can have significant consequences, especially in critical applications or domains that require moral reasoning.
AI system (due to a decision or action made by an AI system)
Unintentional (due to an unexpected outcome from pursuing a goal)
Post-deployment (occurring after the AI model has been trained and deployed)