Amazon deployed AI-powered Netradyne cameras in delivery vans that incorrectly penalized drivers for events beyond their control, affecting driver performance scores and company bonuses.
In February 2021, Amazon began installing AI-powered cameras made by Netradyne in its delivery vans across the United States; over half the fleet was equipped by the time of reporting. Each camera has four lenses and records drivers when it detects events such as following too closely, stop sign violations, and distracted driving.

Six Amazon delivery drivers across California, Texas, Kansas, Alabama, and Oklahoma reported that the cameras regularly punish drivers for events beyond their control, including being cut off by other cars, checking side mirrors, or stopping past stop signs to see around obstructions. The system feeds into weekly performance scores that determine driver bonuses and the payments delivery companies receive from Amazon.

Amazon claims the technology has reduced accidents by 48 percent and various violations by 50-77 percent. However, drivers report frequent false positives: the cameras incorrectly flag yield signs as stop signs and penalize safe driving behaviors, and the appeal process is burdensome and rarely results in overturned decisions. Because safety metrics comprise 40 percent of the delivery companies' weekly Amazon scorecard, erroneous violations have cost some companies thousands of dollars in lost bonuses.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing users to errors and failures that can have significant consequences, especially in critical applications or domains that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed