Economic loss
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and unfair representation of those groups.
Financial harms [52, 160] co-produced through algorithmic systems, especially as they relate to lived experiences of poverty and economic inequality... demonetization algorithms that parse content titles, metadata, and text may penalize words with multiple meanings [51, 81], disproportionately impacting queer, trans, and creators of color [81]. Differential pricing algorithms, where people are systematically shown different prices for the same products, also lead to economic loss [55]. These algorithms may be especially sensitive to feedback loops from existing inequities related to education level, income, and race, as these inequalities are likely reflected in the criteria algorithms use to make decisions [22, 163]. (p. 730)
Part of Allocative Harms
Other risks from Shelby et al. (2023) (24)
Representational Harms
1.1 Unfair discrimination and misrepresentation: Representational Harms > Stereotyping social groups
1.1 Unfair discrimination and misrepresentation: Representational Harms > Demeaning social groups
1.1 Unfair discrimination and misrepresentation: Representational Harms > Erasing social groups
1.3 Unequal performance across groups: Representational Harms > Alienating social groups
1.1 Unfair discrimination and misrepresentation: Representational Harms > Denying people the opportunity to self-identify
1.1 Unfair discrimination and misrepresentation