
Widespread use of persuasive tools contributes to splintered epistemic communities

A Survey of the Potential Long-term Impacts of AI: How AI Could Lead to Long-term Changes in Science, Cooperation, Power, Epistemics and Values


Highly personalized AI-generated misinformation could create "filter bubbles" in which individuals see only content that matches their existing beliefs, undermining shared reality and weakening social cohesion and political processes.

"Even without deliberate misuse, widespread use of powerful persuasion tools could have negative impacts. If such tools were used by many different groups to advance many different ideas, we could see the world splintering into isolated “epistemic communities”, with little room for dialogue or transfer between communities. A similar scenario could emerge via the increasing personalisation of people’s online experiences—in other words, we may see a continuation of the trend towards “filter bubbles” and “echo chambers”, driven by content selection algorithms, that some argue is already happening [3, 25, 51]."(p. 6)

Part of: Worsened epistemic processes for society
