The U.S. government's CBP One mobile app, required for asylum seekers at the Mexico border, systematically failed to recognize darker-skinned individuals due to facial recognition bias, preventing many Black asylum seekers from filing asylum claims.
U.S. Customs and Border Protection (CBP) launched the CBP One mobile app in January 2023 as the sole method for migrants to apply for asylum at the southern border. The app requires users to upload photos for facial recognition verification to receive asylum appointments. Multiple nonprofits working with asylum seekers reported that the app's facial recognition technology systematically failed to recognize people with darker skin tones, particularly affecting Haitian and African asylum seekers. Organizations such as the Sidewalk School, Al Otro Lado, and the Haitian Bridge Alliance documented that approximately 4,000 Black asylum seekers in Reynosa and 1,000 Haitians in Matamoros were unable to complete their applications because of repeated error messages when uploading photos. The app was initially available only in English and Spanish before Haitian Creole was added. Advocates developed workarounds, including using bright construction lights to illuminate faces during photo uploads, but these did not work for children under six. The technical failures effectively barred many Black asylum seekers from exercising their legal right to seek asylum, creating a discriminatory barrier based on race and skin tone.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and unfair representation of those groups.
AI system: due to a decision or action made by an AI system
Unintentional: due to an unexpected outcome from pursuing a goal
Post-deployment: occurring after the AI model has been trained and deployed