
Eroding trust and undermining shared knowledge

The Ethics of Advanced AI Assistants

Gabriel et al. (2024)

Sub-category: Risk Domain

Highly personalized AI-generated misinformation can create "filter bubbles" in which individuals see only content that matches their existing beliefs, undermining shared reality and weakening social cohesion and political processes.

"AI assistants may contribute to the spread of large quantities of factually inaccurate and misleading content, with negative consequences for societal trust in information sources and institutions, as individuals increasingly struggle to discern truth from falsehood."(p. 164)

Part of Misinformation risks
