A Turkish-made autonomous drone, the STM Kargu-2, reportedly used AI to independently hunt down and attack retreating soldiers in Libya's civil war without human control.
In March 2020, during Libya's civil war, a Turkish-made autonomous weapon system, the STM Kargu-2 drone, was reportedly used to hunt down and attack retreating soldiers loyal to General Khalifa Haftar. According to a UN Panel of Experts report released in March 2021 (S/2021/229), the drones were deployed by forces supporting the UN-recognized Government of National Accord. The Kargu-2 is described as a lethal autonomous weapons system that uses machine learning and real-time image processing for target identification. The report states that these weapons were "programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true 'fire, forget and find' capability." The drones reportedly engaged logistics convoys and retreating forces autonomously. While the report strongly implies casualties occurred, it does not explicitly confirm deaths or provide specific casualty figures. The incident is significant because it may represent the first known case of an AI-powered autonomous weapon being used to attack humans in combat, marking a new chapter in autonomous warfare.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to develop cyber weapons (e.g., by coding cheaper, more effective malware), develop new or enhance existing weapons (e.g., Lethal Autonomous Weapons or chemical, biological, radiological, nuclear, and high-yield explosives), or use weapons to cause mass harm.
Entity: AI system (due to a decision or action made by an AI system)
Intent: Intentional (due to an expected outcome from pursuing a goal)
Timing: Post-deployment (occurring after the AI model has been trained and deployed)