
Decision bias

Risk domain sub-category

Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes for, and unfair representation of, those groups.

"Decision bias occurs when one group is unfairly advantaged over another due to decisions of the model. This might be caused by biases in the data and also amplified as a result of the model’s training."
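One common way to make this definition concrete is to compare a model's positive-decision rates across groups (demographic parity). The sketch below is illustrative only and not part of IBM's taxonomy; the group labels, toy data, and function names are assumptions.

```python
# Minimal sketch of measuring decision bias via demographic parity:
# if one group receives positive decisions at a much higher rate than
# another, the model may be unfairly advantaging that group.
# All data here is a hypothetical toy example.

def selection_rates(decisions, groups):
    """Positive-decision rate per group (decisions are 0/1)."""
    rates = {}
    for g in set(groups):
        outcomes = [d for d, grp in zip(decisions, groups) if grp == g]
        rates[g] = sum(outcomes) / len(outcomes)
    return rates

def demographic_parity_gap(decisions, groups):
    """Largest difference in positive-decision rates between any two groups."""
    rates = selection_rates(decisions, groups)
    return max(rates.values()) - min(rates.values())

# Toy example: group "B" is approved far less often than group "A".
decisions = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
groups    = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
gap = demographic_parity_gap(decisions, groups)  # 0.8 vs 0.2 -> gap of 0.6
```

A large gap does not by itself prove unfairness, but it flags exactly the situation the definition describes: one group being systematically advantaged by the model's decisions.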

Supporting Evidence (1)

1. "Bias can harm persons affected by the decisions of the model."
