A transgender individual experienced psychological distress after FaceApp and similar AI systems consistently misgendered them based on eyebrow thickness and the presence of glasses, leading them to overpluck their eyebrows and stop wearing glasses.
A transgender person reported using FaceApp, an AI-powered gender detection application, to validate their gender identity. They found that the system consistently classified them as male when they had thick eyebrows or wore glasses, and as female when their eyebrows were thin or they went without glasses. The pattern held across multiple AI applications, including FaceApp and a "how-old" website. To test it, the person took otherwise identical photos of themselves and altered only their eyebrow appearance using concealer, which changed the AI's gender classification. These inconsistent and seemingly arbitrary classifications caused the individual psychological distress and dysphoria. In response, the person overplucked their eyebrows and stopped wearing glasses to avoid dysphoria triggered by potential misgendering by AI systems. The incident highlights how AI gender classification systems can harm transgender individuals by reinforcing narrow gender stereotypes and causing psychological distress when the systems fail to reflect a person's gender identity.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and misrepresentation of those groups.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed