A transgender woman named Olivia was subjected to invasive screening by a TSA body scanner and officers at Fort Lauderdale-Hollywood International Airport, ultimately being forced to expose her genitals to clear security, an incident that highlights systemic discrimination against transgender travelers by AI-powered screening technology.
On September 15, 2017, a transgender woman named Olivia encountered discriminatory treatment at Fort Lauderdale-Hollywood International Airport when TSA's millimeter wave body scanner flagged her groin area as suspicious. The scanner, manufactured by L3Harris Technologies at a cost of approximately $150,000 per unit, requires the operating officer to select a male or female setting; its detection algorithm then expects a penis on passengers scanned as male and breasts on passengers scanned as female, and treats anatomy that deviates from the selected setting as a potential threat. When a TSA officer pressed the female button for Olivia, the algorithm flagged her anatomy and triggered an alarm. After multiple pat-downs failed to satisfy officers, Olivia was taken to a private room and told she would need to be searched by a male officer, in violation of TSA policy. When she refused, officers threatened to escort her from the terminal. Ultimately, Olivia pulled down her skirt and underwear to expose her genitals to the female officers to prove she posed no threat.

A ProPublica investigation found that 5% of TSA civil rights complaints filed from 2016 to 2019 related to the screening of transgender travelers, even though transgender people make up less than 1% of the population. The investigation documented 174 similar incidents in which transgender travelers faced humiliating treatment after being flagged by the AI-powered scanners. TSA has deployed these scanners at nearly every US airport at a total cost of approximately $110 million, yet the technology continues to systematically discriminate against transgender and gender-nonconforming passengers.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes for and unfair representation of those groups.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed