In 2015, Google's Photos app mislabeled photos of Black people as 'gorillas'. Eight years later, Google and other tech companies still disable primate-recognition features rather than risk repeating the offensive mistake.
In May 2015, Google released its standalone Photos app with AI-powered image analysis that labels people, places, and things. A couple of months later, software developer Jacky Alciné discovered that Google had labeled photos of him and a Black friend as 'gorillas', a label that echoes racist tropes. Google apologized and blocked its software from categorizing anything as a gorilla. Eight years later, testing showed that Google Photos still returns no results for searches for gorillas, baboons, chimpanzees, orangutans, and monkeys, even when a photo library contains images of those primates. Apple Photos behaved similarly, returning results for 'gorilla' only when the word appeared as text in an image, such as on Gorilla Tape packaging. Microsoft OneDrive returned no results for any animal search, while Amazon Photos was over-inclusive, returning a range of primates for gorilla searches. The original error stemmed from training data containing too few photos of Black people, which left the model unfamiliar with darker-skinned faces. Google encountered a similar problem during internal testing of its Nest camera, which mistook some Black people for animals, but fixed it before public release. In each case, the companies chose to disable the problematic feature rather than fix the underlying bias.
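The mitigation described above amounts to suppressing a set of labels at the output stage rather than retraining the model. The minimal Python sketch below illustrates that general approach; the names (SUPPRESSED_LABELS, Prediction, filter_predictions) are hypothetical illustrations, not Google's actual implementation.

from dataclasses import dataclass

# Hypothetical suppression list: labels removed from output entirely
# rather than corrected in the model itself.
SUPPRESSED_LABELS = {"gorilla", "baboon", "chimpanzee", "orangutan", "monkey"}

@dataclass
class Prediction:
    label: str
    confidence: float

def filter_predictions(predictions):
    # Drop any prediction whose label is on the suppression list.
    # The classifier may still score these labels internally; suppression
    # only removes them before indexing, which is why a search for
    # "gorilla" returns nothing even when matching photos exist.
    return [p for p in predictions if p.label.lower() not in SUPPRESSED_LABELS]

raw = [Prediction("gorilla", 0.97), Prediction("forest", 0.81)]
print(filter_predictions(raw))  # only the "forest" prediction survives

Because suppression happens downstream of the classifier, it silences correct identifications of actual primates along with the misclassifications, which matches the behavior observed in testing.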
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Risk category: Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and unfair representation of those groups.
Entity: AI system (due to a decision or action made by an AI system)
Intent: Unintentional (due to an unexpected outcome from pursuing a goal)
Timing: Post-deployment (occurring after the AI model has been trained and deployed)