Discriminative data bias
Unequal treatment of individuals or groups by an AI system, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and misrepresentation of those groups.
"Discriminative data bias describes the systematic discrimination of groups of persons in the form of data shortcomings, such as distributional representation or incorrectness. Data bias can manifest in the model and lead to unfair decisions if not appropriately treated. Note, that the term bias is often used in other contexts, such as data representation. However, these issues are treated by other AI hazards in this list."(p. 9)
Other risks from Schnitzer2024 (24)
Inadequate specification of ODD (operational design domain): 7.3 Lack of capability or robustness
Inappropriate degree of automation: 7.2 AI possessing dangerous capabilities
Inadequate planning of performance requirements: 7.3 Lack of capability or robustness
Insufficient AI development documentation: 7.4 Lack of transparency or interpretability
Inappropriate degree of transparency to end users: 7.4 Lack of transparency or interpretability
Choice of untrustworthy data source: 7.0 AI System Safety, Failures & Limitations