
Stereotyping social groups

Sociotechnical Harms of Algorithmic Systems: Scoping a Taxonomy for Harm Reduction

Shelby et al. (2023)

Risk Domain

Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and unfair representation of those groups.

Stereotyping in an algorithmic system refers to how the system's outputs reflect "beliefs about the characteristics, attributes, and behaviors of members of certain groups ... and about how and why certain attributes go together" (p. 728).

Part of Representational Harms
