Lack of accountability and liability
Risk Domain
Inadequate regulatory frameworks and oversight mechanisms that fail to keep pace with AI development, leading to ineffective governance and the inability to manage AI risks appropriately.
"Determining responsibility when EAI causes harm requires new accountability and liability frameworks that address the complexities of highly autonomous physical systems. Human users may disagree with decisions taken by expert EAI systems, raising significant questions of delegation and responsibility [108]. Lack of EAI accountability could lead to confusion for users and breakdowns in traditional justice systems [109]."(p. 6)
Entity: Who or what caused the harm
Intent: Whether the harm was intentional or accidental
Timing: Whether the risk is pre- or post-deployment
Supporting Evidence (1)
1. "For example, we may soon need to consider who to blame and how to collect damages when a highly autonomous robotic surgeon removes a healthy organ by mistake [110]. Although virtual AI applications also raise liability concerns, EAI's ability to cause physical damage underscores the importance of establishing robust liability regimes, as liability is crucial in remedying physical harms."(p. 6)
Other risks from Perlo et al. (2025) (12)

Economic Risks
Domain: 6.0 Socioeconomic & Environmental; Entity: Not coded; Intent: Not coded; Timing: Not coded

Purposeful or malicious harm
Domain: 4.2 Cyberattacks, weapon development or use, and mass harm; Entity: Human; Intent: Intentional; Timing: Post-deployment

Accidental harm
Entity: Other; Intent: Unintentional; Timing: Post-deployment

Privacy Violations
Domain: 2.1 Compromise of privacy by leaking or correctly inferring sensitive information; Entity: AI system; Intent: Unintentional; Timing: Post-deployment

Misinformation
Domain: 3.1 False or misleading information; Entity: AI system; Intent: Unintentional; Timing: Post-deployment

Labour Displacement
Domain: 6.2 Increased inequality and decline in employment quality; Entity: AI system; Intent: Unintentional; Timing: Post-deployment