Jordan's World Bank-funded Takaful algorithm for distributing cash assistance was found to exclude eligible families through biased and oversimplified poverty indicators, affecting 220,000 enrolled families and potentially many more applicants.
Takaful (the Unified Cash Transfer Program) is an algorithmic system funded by the World Bank and managed by Jordan's National Aid Fund that ranks families applying for financial assistance from least poor to poorest using 57 socioeconomic indicators. The program has cost over $1 billion and serves 220,000 beneficiary families, a small fraction of the households living under Jordan's official poverty line. Human Rights Watch conducted 70 interviews over two years and found that the algorithm's secret calculus introduces bias and inaccuracy by relying on indicators, such as water and electricity consumption, car ownership, and household size, that do not reliably reflect poverty. The system also reinforces gender-based discrimination: because Jordanian women cannot pass citizenship to non-citizen spouses, their reportable household sizes are lower, which reduces their apparent need. Families reported confusion and distrust about the ranking methodology, and some were excluded despite genuine need, such as families who had inherited property but lacked financial means, or who owned older cars necessary for work. The World Bank is funding similar projects in eight other countries in the Middle East and Africa, including a $100 million loan to Tunisia to integrate machine learning into its social registry.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes and representation of those groups.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed