
Societal System Harms

Sociotechnical Harms of Algorithmic Systems: Scoping a Taxonomy for Harm Reduction

Shelby et al. (2023)

Category

"Social system or societal harms reflect the adverse macro-level effects of new and reconfigurable algorithmic systems, such as systematizing bias and inequality [84] and accelerating the scale of harm [137]"(p. 731)

Sub-categories (5)

Information harms

Information-based harms capture concerns of misinformation, disinformation, and malinformation. Algorithmic systems, especially generative models and recommender systems, can lead to these information harms.

3.1 False or misleading information
Other · Unintentional · Post-deployment

Cultural harms

Cultural harm has been described as the development or use of algorithmic systems that affects cultural stability and safety, such as “loss of communication means, loss of cultural property, and harm to social values”

5.2 Loss of human agency and autonomy
AI system · Unintentional · Post-deployment

Civic and political harms

Political harms emerge when “people are disenfranchised and deprived of appropriate political power and influence” [186, p. 162]. These harms center on the domain of government, examining how algorithmic systems govern through individualized nudges or micro-directives [187] that may destabilize governance systems, erode human rights, be used as weapons of war [188], and enact surveillant regimes that disproportionately target and harm people of color.

4.1 Disinformation, surveillance, and influence at scale
Other · Intentional · Post-deployment

Labor & material/Macro-socio economic harms

Algorithmic systems can increase “power imbalances in socio-economic relations” at the societal level [4, 137, p. 182], including through exacerbating digital divides and entrenching systemic inequalities [114, 230]. The development of algorithmic systems may tap into and foster forms of labor exploitation [77, 148], such as unethical data collection, worsening worker conditions [26], or lead to technological unemployment [52], such as deskilling or devaluing human labor [170]... when algorithmic financial systems fail at scale, these can lead to “flash crashes” and other adverse incidents with widespread impacts

6.1 Power centralization and unfair distribution of benefits
Other · Other · Post-deployment

Environmental harms

Depletion or contamination of natural resources, and damage to built environments... that may occur throughout the lifecycle of digital technologies [170, 237], from “cradle (mining) to usage (consumption) to grave (waste)”

6.6 Environmental harm
AI system · Unintentional · Post-deployment

Other risks from Shelby et al. (2023) (24)