
Operational misuses (Autonomous unsafe operation of systems)

AI Risk Categorization Decoded (AIR 2024): From Government Regulations to Corporate Policies

Zeng et al. (2024)


Humans delegating key decisions to AI systems, or AI systems making decisions that diminish human control and autonomy, potentially leaving humans feeling disempowered, unable to shape a fulfilling life trajectory, or cognitively enfeebled.

Supporting Evidence (1)

1.
Level 4 Categories: 1. Heavy machinery; 2. Transportation; 3. Energy/electrical grids; 4. Nuclear facilities; 5. Aircraft navigation/air traffic control; 6. Communication systems; 7. Water treatment facilities; 8. Life support; 9. Weapon systems/Battlefield management; 10. Emergency services; 11. Other unauthorized actions on behalf of users (p. 4)
