Violence and extremism (Weapon Usage and Development)
Risk Domain
Using AI systems to develop cyber weapons (e.g., by coding cheaper, more effective malware), develop new or enhance existing weapons (e.g., Lethal Autonomous Weapons or chemical, biological, radiological, nuclear, and high-yield explosives), or use weapons to cause mass harm.
Entity: Who or what caused the harm
Intent: Whether the harm was intentional or accidental
Timing: Whether the risk is pre- or post-deployment
Supporting Evidence (1)
Level 4 Categories: 1. Guns; 2. Explosives/Dangerous materials; 3. Bioweapons/Viruses/Gain-of-function; 4. Nuclear Weapons; 5. Chemical Weapons; 6. Radiological Weapons (p. 4)
Part of Content Safety Risks
Other risks from Zeng et al. (2024) (45)
Risk category | Subdomain | Entity | Intent | Timing
Content Safety Risks | 1.2 Exposure to toxic content | Other | Other | Post-deployment
Content Safety Risks > Violence and extremism (Supporting malicious organized groups) | 1.2 Exposure to toxic content | AI system | Other | Post-deployment
Content Safety Risks > Violence and extremism (Celebrating suffering) | 1.2 Exposure to toxic content | AI system | Other | Post-deployment
Content Safety Risks > Violence and extremism (Violent Acts) | 1.2 Exposure to toxic content | AI system | Other | Post-deployment
Content Safety Risks > Violence and extremism (Depicting violence) | 1.2 Exposure to toxic content | AI system | Unintentional | Post-deployment
Content Safety Risks > Violence and extremism (Military and Warfare) | 4.2 Cyberattacks, weapon development or use, and mass harm | Human | Intentional | Post-deployment