LinkedIn's search algorithm exhibited gender bias by suggesting male names when users searched for female names, but not suggesting female names when users searched for male names.
LinkedIn's search algorithm demonstrated systematic gender bias in its name-suggestion feature. When users searched for common female names such as 'Stephanie Williams', the system asked whether they meant a male equivalent such as 'Stephen Williams', even though approximately 2,500 profiles carried the female name. This pattern occurred for at least a dozen common female names in the US, with the system suggesting male alternatives for names including Andrea (Andrew), Danielle (Daniel), Michaela (Michael), and Alexa (Alex). By contrast, searches for any of the 100 most common male names never produced female alternatives. LinkedIn stated that its algorithm was based on an analysis of past searcher tendencies and the relative frequencies of words in past queries and member profiles, and that it was not related to gender. The company had approximately 450 million members globally at the time. Following media attention, LinkedIn updated its algorithm on September 7th to explicitly recognize people's names so that the system would not suggest alternative names of any gender.
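The mechanism LinkedIn described, suggestions driven purely by the relative frequency of past queries, can produce exactly this kind of one-directional behavior without any explicit reference to gender. The sketch below is a minimal, hypothetical illustration of such a frequency-ratio heuristic; the counts, similarity map, and threshold are invented for demonstration and are not LinkedIn's actual data or code.

```python
# Hypothetical sketch of a frequency-based "did you mean" heuristic.
# All counts and mappings below are illustrative placeholders, not real data.

# Assumed counts of past searches for each name.
QUERY_COUNTS = {
    "stephen williams": 9000,
    "stephanie williams": 2500,
    "andrew": 12000,
    "andrea": 4000,
}

# Hand-built map of "similar" queries; a real system might derive this
# from edit distance or co-occurrence statistics.
SIMILAR = {
    "stephanie williams": "stephen williams",
    "stephen williams": "stephanie williams",
    "andrea": "andrew",
    "andrew": "andrea",
}

# Suggest the alternative only if it is this many times more frequent.
SUGGESTION_RATIO = 2.0


def suggest_alternative(query: str) -> str | None:
    """Return a more frequent alternative query, if one exists."""
    alternative = SIMILAR.get(query)
    if alternative is None:
        return None
    if QUERY_COUNTS.get(alternative, 0) >= SUGGESTION_RATIO * QUERY_COUNTS.get(query, 0):
        return alternative
    return None


if __name__ == "__main__":
    # The female-to-male direction triggers a suggestion...
    print(suggest_alternative("stephanie williams"))  # -> "stephen williams"
    # ...but the reverse does not, even though both names have many profiles.
    print(suggest_alternative("stephen williams"))    # -> None
```

Because the threshold compares raw query frequencies, whichever name is searched more often only ever appears as the suggestion, never the query being corrected, which matches the asymmetry reported in this incident.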
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and unfair representation of those groups.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed