The Israeli military developed and deployed an AI system called Lavender that marked approximately 37,000 Palestinians as potential targets for assassination, contributing to widespread civilian casualties as the military acted on the system's recommendations with minimal human oversight during the Gaza war.
In 2023, the Israel Defense Forces deployed an AI-powered targeting system called Lavender during military operations in Gaza following the October 7 Hamas attack. The system was designed to identify suspected members of the military wings of Hamas and Palestinian Islamic Jihad by analyzing surveillance data on most of Gaza's 2.3 million residents. According to investigative reporting by +972 Magazine and Local Call, based on interviews with six Israeli intelligence officers, Lavender marked approximately 37,000 Palestinians as potential targets for assassination. The system had a known error rate of roughly 10%, yet human operators typically spent only about 20 seconds reviewing each target recommendation, primarily to confirm that the target was male. The military reportedly authorized killing up to 15-20 civilians for each suspected junior militant and over 100 civilians for senior commanders. During the early weeks of the war, the system was used to conduct systematic nighttime bombings of family homes, often with unguided 'dumb' bombs. A companion system called 'Where's Daddy?' tracked targets to their residences. According to Palestinian Health Ministry data, approximately 15,000 Palestinians were killed in the first six weeks of the war, with entire families wiped out in their homes.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to develop cyber weapons (e.g., by coding cheaper, more effective malware), to develop new or enhance existing weapons (e.g., Lethal Autonomous Weapons or chemical, biological, radiological, nuclear, and high-yield explosive weapons), or to use weapons to cause mass harm.
AI system
Due to a decision or action made by an AI system
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed