Socioeconomic and environmental harms
"AI systems amplifying existing inequalities or creating negative impacts on employment, innovation, and the environment"(p. 14)
Sub-categories (5)

- Unfair distribution of benefits from model access
  "Unfairly allocating or withholding benefits from certain groups due to hardware, software, or skills constraints or deployment contexts (e.g. geographic region, internet speed, devices)"
  → 6.1 Power centralization and unfair distribution of benefits

- Environmental damage
  "Creating negative environmental impacts through model development and deployment"
  → 6.6 Environmental harm

- Inequality and precarity
  "Amplifying social and economic inequality, or precarious or low-quality work"
  → 6.2 Increased inequality and decline in employment quality

- Undermine creative economies
  "Substituting original works with synthetic ones, hindering human innovation and creativity"
  → 6.3 Economic and cultural devaluation of human effort

- Exploitative data sourcing and enrichment
  "Perpetuating exploitative labour practices to build AI systems (sourcing, user testing)"
  → 6.2 Increased inequality and decline in employment quality

Other risks from Weidinger et al. (2023) (26)
- Representation & Toxicity Harms → 1.0 Discrimination & Toxicity
- Representation & Toxicity Harms > Unfair representation → 1.1 Unfair discrimination and misrepresentation
- Representation & Toxicity Harms > Unfair capability distribution → 1.3 Unequal performance across groups
- Representation & Toxicity Harms > Toxic content → 1.2 Exposure to toxic content
- Misinformation Harms → 3.0 Misinformation
- Misinformation Harms > Propagating misconceptions / false beliefs → 3.1 False or misleading information