
Denying people the opportunity to self-identify

Sociotechnical Harms of Algorithmic Systems: Scoping a Taxonomy for Harm Reduction

Shelby et al. (2023)

Sub-category of the risk domain: Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and unfair representation of those groups.

"complex and non-traditional ways in which humans are represented and classified automatically, and often at the cost of autonomy loss ... such as categorizing someone who identifies as non-binary into a gendered category they do not belong ... undermines people's ability to disclose aspects of their identity on their own terms" (p. 728)

Part of Representational Harms
