Home / Risks / IBM2025 / Data bias

Data bias

Sub-category
Risk Domain

Unequal treatment of individuals or groups by an AI system, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes for, and unfair representation of, those groups.

"Historical and societal biases that are present in the data are used to train and fine-tune the model."

Supporting Evidence (1)

1. "Training an AI system on data with bias, such as historical or societal bias, can lead to biased or skewed outputs that can unfairly represent or otherwise discriminate against certain groups or individuals."
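The mechanism described above — historical bias in training labels being reproduced by the trained model — can be illustrated with a minimal sketch. The dataset, group names, and the deliberately simple "model" below are all hypothetical, chosen only to make the effect visible; any learner that fits skewed data well will absorb the same skew.

```python
from collections import defaultdict

# Hypothetical historical hiring records as (group, hired) pairs.
# The labels encode a societal bias: group "A" was hired far more
# often than group "B" for otherwise comparable candidates.
history = ([("A", 1)] * 80 + [("A", 0)] * 20 +
           [("B", 1)] * 30 + [("B", 0)] * 70)

def fit_majority_by_group(records):
    """A deliberately simple 'model': predict each group's majority label."""
    counts = defaultdict(lambda: [0, 0])  # group -> [negatives, positives]
    for group, label in records:
        counts[group][label] += 1
    return {g: int(c[1] > c[0]) for g, c in counts.items()}

def selection_rate(records, model, group):
    """Fraction of a group's members the model would select."""
    preds = [model[g] for g, _ in records if g == group]
    return sum(preds) / len(preds)

model = fit_majority_by_group(history)
# Demographic-parity gap: difference in selection rates between groups.
gap = selection_rate(history, model, "A") - selection_rate(history, model, "B")
```

Here the model selects everyone from group "A" and no one from group "B" (a demographic-parity gap of 1.0): the historical skew in the data resurfaces, amplified, in the model's outputs.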
