Transformation of H2M interaction
Users anthropomorphizing, trusting, or relying on AI systems, leading to emotional or material dependence and to inappropriate relationships with, or expectations of, AI systems. This trust can be exploited by malicious actors (e.g., to harvest personal information or enable manipulation), or it can result in harm from inappropriate use of AI in critical situations (e.g., a medical emergency). Overreliance on AI systems can also compromise autonomy and weaken social ties.
"Human interaction with machines is a big challenge to society because it is already changing human behavior. Meanwhile, it has become normal to use AI on an everyday basis, for example, googling for information, using navigation systems and buying goods via speaking to an AI assistant like Alexa or Siri (Mills, 2018; Thierer et al., 2017). While these changes greatly contribute to the acceptance of AI systems, this development leads to a problem of blurred borders between humans and machines, where it may become impossible to distinguish between them. Advances like Google Duplex were highly criticized for being too realistic and human without disclosing their identity as AI systems (Bergen, 2018)." (p. 821)
Part of AI Society
Other risks from Wirtz, Weyerer & Sturm (2020) (11)
AI Law and Regulation
6.5 Governance failure: AI Law and Regulation > Governance of autonomous intelligence systems
6.5 Governance failure: AI Law and Regulation > Responsibility and accountability
6.5 Governance failure: AI Law and Regulation > Privacy and safety
4.1 Disinformation, surveillance, and influence at scale: AI Ethics
7.3 Lack of capability or robustness: AI Ethics > AI-rulemaking for human behaviour
7.3 Lack of capability or robustness