Worsened epistemic processes for society
A Survey of the Potential Long-term Impacts of AI: How AI Could Lead to Long-term Changes in Science, Cooperation, Power, Epistemics and Values
Highly personalized AI-generated misinformation could create "filter bubbles" in which individuals see only content that matches their existing beliefs, undermining shared reality and weakening social cohesion and political processes.
"Epistemic processes and problem solving: we currently see more reasons to be concerned about AI worsening society's epistemic processes than reasons to be optimistic about AI helping us better solve problems as a society. For example, increased use of content selection algorithms could drive epistemic insularity and a decline in trust in credible multipartisan sources, which reducing our ability to deal with important long-term threats and challenges such as pandemics and climate change."(p. 9)
Sub-categories (5)
AI contributes to increased online polarisation
"One of the most significant commercial uses of current AI systems is in the content recommendation algorithms of social media companies, and there are already concerns that this is contributing to worsened polarisation online"
3.2 Pollution of information ecosystem and loss of consensus reality
AI is used to scale up production of false and misleading information
"At the same time, we are seeing how AI can be used to scale up the production of convincing yet false or misleading information online (e.g. via image, audio, and text synthesis models like BigGAN [6] and GPT-3 [7])."
4.1 Disinformation, surveillance, and influence at scale
AI's persuasive capabilities are misused to gain influence and promote harmful ideologies
"As AI capabilities advance, they may be used to develop sophisticated persuasion tools, such as those that tailor their communication to specific users to persuade them of certain claims [42]. While these tools could be used for social good— such as New York Times’ chatbot that helps users to persuade people to get vaccinated against Covid-19 [27]—there are also many ways they could be misused by self-interested groups to gain influence and/or to promote harmful ideologies."
4.1 Disinformation, surveillance, and influence at scale
Widespread use of persuasive tools contributes to splintered epistemic communities
"Even without deliberate misuse, widespread use of powerful persuasion tools could have negative impacts. If such tools were used by many different groups to advance many different ideas, we could see the world splintering into isolated “epistemic communities”, with little room for dialogue or transfer between communities. A similar scenario could emerge via the increasing personalisation of people’s online experiences—in other words, we may see a continuation of the trend towards “filter bubbles” and “echo chambers”, driven by content selection algorithms, that some argue is already happening [3, 25, 51]."
3.2 Pollution of information ecosystem and loss of consensus reality
Reduced decision-making capacity as a result of decreased trust in information
"In addition, the increased awareness of these trends in information production and distribution could make it harder for anyone to evaluate the trustworthiness of any information source, reducing overall trust in information. In all of these scenarios, it would be much harder for humanity to make good decisions on important issues, particularly due to declining trust in credible multipartisan sources, which could hamper attempts at cooperation and collective action. The vaccine and mask hesitancy that exacerbated Covid-19, for example, were likely the result of insufficient trust in public health advice [71]. These concerns could be especially worrying if they play out during another major world crisis. We could imagine an even more virulent pandemic, where actors exploit the opportunity to spread misinformation and disinformation to further their own ends. This could lead to dangerous practices, a significantly increased burden on health services, and much more catastrophic outcomes [64]."
3.2 Pollution of information ecosystem and loss of consensus reality
Other risks from Clarke2023 (19)
Worsened conflict
6.4 Competitive dynamics
Worsened conflict > AI enables development of weapons of mass destruction
4.2 Cyberattacks, weapon development or use, and mass harm
Worsened conflict > AI enables automation of military decision-making
5.2 Loss of human agency and autonomy
Worsened conflict > AI-induced strategic instability
5.2 Loss of human agency and autonomy
Worsened conflict > Resource conflicts driven by AI development
6.4 Competitive dynamics
Increased power concentration and inequality
6.1 Power centralization and unfair distribution of benefits