Facial detection software used in school testing fails to recognize students of color, causing them to be flagged to their teachers despite being present for tests and classes.
Facial detection software that students must use to take tests and attend classes has been found to fail at recognizing non-white skin tones over half the time. The report describes the case of a student named Amaya, whom the software repeatedly failed to recognize because of her skin tone. Failures like these cause students of color to be flagged to their teachers even when they are present and attempting to participate in testing or class activities. The software is widely deployed across schools and described as essential for student testing and class participation. Mozilla approached the creators to help tell Amaya's story of encountering this discriminatory software. The incident highlights systematic bias in facial recognition technology used in educational settings.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
The accuracy and effectiveness of the AI system's decisions and actions depend on group membership: design decisions and biased training data produce unequal outcomes, reduced benefits, increased effort, and alienation for affected users.
AI system: Due to a decision or action made by an AI system
Unintentional: Due to an unexpected outcome from pursuing a goal
Post-deployment: Occurring after the AI model has been trained and deployed