Real Women in Trucking filed a discrimination complaint against Meta alleging that Facebook's algorithm shows job advertisements based on gender and age stereotypes, with women less likely to see blue-collar job ads and older workers far less likely to see job ads overall.
Real Women in Trucking filed a complaint with the Equal Employment Opportunity Commission alleging that Meta's Facebook platform delivers job advertisements in an algorithmically biased way. The complaint claims that Facebook's algorithm shows job ads for truckers, construction workers, and firefighters mostly to men, while showing ads for housekeepers, home care workers, and child care workers mostly to women, even when employers request that ads be shown to all ages and genders. Data from Facebook's own advertising library shows that recipients of certain job listings were more than 99% male and more than 99% younger than 55. The organization also alleges that older job seekers are usually far less likely than younger job seekers to receive job ads, and that women receive a disproportionate share of ads for lower-paid jobs in social services, food services, education, and health care. This follows earlier Meta settlements over discriminatory ad delivery: a $5 million settlement in 2019 and a June 2022 settlement with the Justice Department over similar discrimination in housing ads. Civil rights law prohibits steering ads based on gender or age, and the EEOC charge is based on publicly available data from Meta's advertising library.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes for and unfair representation of those groups.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed