Widespread use of persuasive tools contributes to splintered epistemic communities
A Survey of the Potential Long-term Impacts of AI: How AI Could Lead to Long-term Changes in Science, Cooperation, Power, Epistemics and Values
Highly personalized AI-generated misinformation could create “filter bubbles” in which individuals see only content that matches their existing beliefs, undermining shared reality and weakening social cohesion and political processes.
"Even without deliberate misuse, widespread use of powerful persuasion tools could have negative impacts. If such tools were used by many different groups to advance many different ideas, we could see the world splintering into isolated “epistemic communities”, with little room for dialogue or transfer between communities. A similar scenario could emerge via the increasing personalisation of people’s online experiences—in other words, we may see a continuation of the trend towards “filter bubbles” and “echo chambers”, driven by content selection algorithms, that some argue is already happening [3, 25, 51]."(p. 6)
Part of Worsened epistemic processes for society
Other risks from Clarke2023 (19)
Worsened conflict → 6.4 Competitive dynamics
Worsened conflict > AI enables development of weapons of mass destruction → 4.2 Cyberattacks, weapon development or use, and mass harm
Worsened conflict > AI enables automation of military decision-making → 5.2 Loss of human agency and autonomy
Worsened conflict > AI-induced strategic instability → 5.2 Loss of human agency and autonomy
Worsened conflict > Resource conflicts driven by AI development → 6.4 Competitive dynamics
Increased power concentration and inequality → 6.1 Power centralization and unfair distribution of benefits