Responsibility and accountability
Regulatory frameworks and oversight mechanisms that fail to keep pace with AI development lead to ineffective governance and an inability to manage AI risks appropriately.
"The challenge of responsibility and accountability is an important concept for the process of governance and regulation. It addresses the question of who is to be held legally responsible for the actions and decisions of AI algorithms. Although humans operate AI systems, questions of legal responsibility and liability arise. Due to the self-learning ability of AI algorithms, the operators or developers cannot predict all actions and results. Therefore, a careful assessment of the actors and a regulation for transparent and explainable AI systems is necessary (Helbing et al., 2017; Wachter et al., 2017)"(p. 820)
Part of AI Law and Regulation
Other risks from Wirtz, Weyerer & Sturm (2020) (11)
6.5 Governance failure: AI Law and Regulation > Governance of autonomous intelligence systems
6.5 Governance failure: AI Law and Regulation > Privacy and safety
4.1 Disinformation, surveillance, and influence at scale: AI Ethics
7.3 Lack of capability or robustness: AI Ethics > AI-rulemaking for human behaviour
7.3 Lack of capability or robustness: AI Ethics > Compatibility of AI vs. human value judgement
7.3 Lack of capability or robustness