Risk of Injury
Risk Domain
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and misrepresentation of those groups.
"Poorly designed intelligent systems can cause moral, psychological, and physical harm. For example, the use of predictive policing tools may cause more people to be arrested or physically harmed by the police." (p. 524)
Entity: who or what caused the harm
Intent: whether the harm was intentional or accidental
Timing: whether the risk arises pre- or post-deployment
Other risks from Paes, Silveira & Akkari (2023)
| Risk | Risk domain | Entity | Intent | Timing |
|---|---|---|---|---|
| Bias and discrimination | 1.1 Unfair discrimination and misrepresentation | AI system | Unintentional | Post-deployment |
| Data breach / privacy & liberty | 1.1 Unfair discrimination and misrepresentation | AI system | Unintentional | Post-deployment |
| Usurpation of jobs by automation | 6.2 Increased inequality and decline in employment quality | Human | Intentional | Post-deployment |
| Lack of transparency | 7.4 Lack of transparency or interpretability | AI system | Unintentional | Post-deployment |
| Reduced autonomy / responsibility | 5.2 Loss of human agency and autonomy | Other | Unintentional | Post-deployment |
| Environmental impacts | 6.6 Environmental harm | Human | Unintentional | Post-deployment |