The Department of Veterans Affairs' REACH VET AI algorithm for suicide prevention prioritizes white male veterans while ignoring female veterans and survivors of sexual violence, despite rising suicide rates among women veterans.
The Department of Veterans Affairs deployed an AI system called REACH VET in 2017 to identify the veterans at highest statistical risk of suicide and target assistance accordingly. The algorithm weighs 61 variables and flags roughly 6,700 veterans each month for extra help.

An investigation revealed that the system gives preference to veterans who are 'divorced and male' and 'widowed and male' but to no group of female veterans. Military sexual trauma and intimate partner violence, both linked to elevated suicide risk among female veterans, are not among the algorithm's variables.

Government data show a 24% rise in suicide rates among female veterans between 2020 and 2021, four times the increase among male veterans over the same period. The VA's executive director for suicide prevention stated that military sexual trauma was considered but excluded because it was not among 'the most powerful' predictors. About 1 in 3 women veterans report experiencing military sexual trauma, compared with 1 in 50 men. The algorithm identified being a man, especially a white man, as more predictive of suicide than clinical factors known to affect women. Female veterans are the VA's fastest-growing population, up from 159,810 in 2001 to over 800,000 today and accounting for 30% of new VA patients.
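The "most powerful predictors" rationale above reflects a common feature-selection pattern that can disadvantage small subgroups. The sketch below is purely hypothetical (not the VA's code, data, or variables): it builds a synthetic population that is 90% one group and 10% another, gives each group its own predictive flag, and shows that ranking candidate variables by population-wide accuracy pushes the minority-specific predictor to the bottom, where a "keep only the strongest" cutoff would discard it.

```python
import random

def accuracy(feature, outcomes):
    """Fraction of records where a binary feature agrees with the outcome."""
    return sum(f == o for f, o in zip(feature, outcomes)) / len(outcomes)

random.seed(0)

# Synthetic records (hypothetical field names): 'marital_flag' perfectly
# predicts the outcome for the 900-person majority group; 'trauma_flag'
# perfectly predicts it for the 100-person minority group, and is noise
# for everyone else.
records = []
for _ in range(900):  # majority group
    outcome = random.random() < 0.5
    records.append({"marital_flag": outcome,
                    "trauma_flag": random.random() < 0.5,
                    "outcome": outcome})
for _ in range(100):  # minority group
    outcome = random.random() < 0.5
    records.append({"marital_flag": random.random() < 0.5,
                    "trauma_flag": outcome,
                    "outcome": outcome})

outcomes = [r["outcome"] for r in records]
scores = {name: accuracy([r[name] for r in records], outcomes)
          for name in ("marital_flag", "trauma_flag")}

# Population-wide, the majority-group predictor scores ~0.95 while the
# minority-group predictor scores ~0.55, so a top-k cutoff keyed to overall
# predictive power drops the only signal the minority group has.
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked, scores)
```

The point of the sketch is that "not among the most powerful predictors" is a statement about the whole population, not about any subgroup: a variable can be the single strongest predictor for a 10% subgroup and still rank near the bottom overall.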
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by an AI system, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes or misrepresentation of those groups.
Human
Due to a decision or action made by humans
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed