Coffee Meets Bagel's dating algorithm systematically showed users potential matches primarily from their own racial group, even when users selected 'no preference' for ethnicity, leading to racial segregation in dating matches.
Coffee Meets Bagel, a dating app founded by the Kang sisters, launched in New York and later expanded to Los Angeles and 11 additional cities. The app implemented an algorithm that used ethnicity preferences to determine match pools. Its roughly 60,000 users were predominantly white, Asian, Jewish, and educated; only 17% were non-white, non-Jewish, and non-Asian. When users selected 'no preference' for ethnicity, the algorithm interpreted this as permission to show matches based on empirical data indicating that people are more likely to match within their own ethnic group. Multiple users reported receiving matches almost exclusively from their own racial group: white and Asian women received mostly Asian men, and Latino men received only Latina women. The algorithm was designed to maximize connection rates, drawing on data showing that 53% of white women and 74% of Jewish women preferred to date only white men. Users who wanted diverse matches were pushed into a Nash-equilibrium-like situation in which they had to explicitly exclude certain ethnicities in order to receive varied matches. The app's founders defended this approach, stating that behavioral data showed users had clear ethnic preferences even when they claimed to have none.
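The reported failure mode — treating 'no preference' as license to substitute an inferred same-group preference — can be illustrated with a minimal sketch. This is purely hypothetical code; the class names, the `empirical_same_group_rate` parameter, and all data are invented for illustration and do not reflect Coffee Meets Bagel's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    ethnicity: str
    # An empty list represents a stated 'no preference'.
    stated_preference: list = field(default_factory=list)

def effective_preference(user, empirical_same_group_rate=0.8):
    """Return the preference the matcher actually applies.

    Hypothetical logic: if the user states no preference, fall back
    to the empirically dominant pattern (same-ethnicity matching) --
    the reinterpretation described in the incident above.
    """
    if user.stated_preference:
        return user.stated_preference
    # 'No preference' is silently replaced by a same-group preference
    # whenever historical data says same-group matches connect more.
    if empirical_same_group_rate > 0.5:
        return [user.ethnicity]
    return []  # genuinely unconstrained

def candidate_pool(user, candidates):
    """Filter candidates by the effective (not the stated) preference."""
    pref = effective_preference(user)
    if not pref:
        return list(candidates)
    return [c for c in candidates if c.ethnicity in pref]

# A user with 'no preference' receives only same-group candidates.
alice = User("Alice", "white")
pool = candidate_pool(alice, [
    User("B", "white"), User("C", "latino"), User("D", "asian"),
])
print([c.ethnicity for c in pool])  # only 'white' remains
```

The sketch makes the equilibrium problem visible: the only way for `alice` to see varied matches is to set an explicit `stated_preference` that excludes or includes groups by hand, because leaving it empty is overridden by the inferred default.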
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by an AI system, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes for and unfair representation of those groups.
AI system
Due to a decision or action made by an AI system
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed