New Zealand's automated passport photo checker rejected Richard Lee's application because facial recognition software incorrectly determined that his eyes were closed, when they were in fact open.
Richard Lee, a 22-year-old New Zealand citizen of Asian descent studying in Melbourne, attempted to renew his passport online using the automated photo checker operated by New Zealand's Department of Internal Affairs. The facial recognition software rejected his photo with the error message 'Subject eyes are closed' despite his eyes being clearly open. When Lee contacted the department, he was told the problem stemmed from 'uneven lighting on the face' and 'shadows in his eyes', which made the photo difficult for the software to process. After multiple rejections, he took new photos at an Australia Post outlet, and one was eventually accepted. The Department of Internal Affairs stated that up to 20% of photos submitted online are rejected for various reasons, that closed eyes are the most common trigger, and that 'Subject eyes are closed' is a generic error message. The department maintained confidence in its system and said it does not believe the software discriminates against any specific individuals. Lee took the incident with humor, posting about it on Facebook, where it went viral, though some commenters suggested the technology was racist.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
The accuracy and effectiveness of AI decisions and actions depend on group membership: decisions made in AI system design, combined with biased training data, lead to unequal outcomes, reduced benefits, increased effort, and alienation for affected users.
AI system: Due to a decision or action made by an AI system
Unintentional: Due to an unexpected outcome from pursuing a goal
Post-deployment: Occurring after the AI model has been trained and deployed