Google's image search results for professional occupations such as 'CEO' systematically underrepresented women, showing only 11% women versus the 27% of U.S. CEOs who were actually women, and research demonstrated that this bias influenced users' perceptions of gender representation in various fields.
Researchers from the University of Washington and the University of Maryland analyzed Google's image search results for 45 occupations in July 2013, comparing them against 2012 U.S. Bureau of Labor Statistics data, and found significant gender bias in the results for certain professions. For 'CEO' searches, only 11% of the people depicted were women, compared with the 27% of actual U.S. CEOs who were women. Similarly, only 25% of the authors shown in search results were women versus 56% in reality, while 64% of the telemarketers shown were women despite an even gender split in that profession. In follow-up experiments, the researchers showed manipulated search results to study participants and found that exposure to skewed results shifted participants' estimates of gender representation in an occupation by an average of 7%. In the 'CEO' search specifically, the first image of a woman was CEO Barbie, appearing many rows down, and similar patterns were observed on other search engines such as Bing and Yahoo. The study also found that images of people matching occupational gender stereotypes were rated as more professional and trustworthy, while images of people who did not match the stereotypes were more often sexualized or otherwise inappropriate.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and misrepresentation of those groups.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed