NarxCare, an AI-driven risk-scoring tool used by most US prescription drug monitoring programs, incorrectly assigned high risk scores to chronic pain patients, leading to denials of medical care and opioid prescriptions.
NarxCare is an AI-powered analytics platform developed by Appriss (now Bamboo Health) that generates risk scores to predict patients' likelihood of opioid misuse. The system is used by nearly every US state to manage prescription drug monitoring programs, and physicians and pharmacists are legally required to consult it when prescribing controlled substances. The proprietary machine-learning algorithm draws data from state prescription registries and potentially other sources, including medical records, criminal justice data, and EMS data, to generate Overdose Risk Scores.

Multiple patients reported being denied medical care due to high NarxCare scores, including Kathryn, whose score was elevated because veterinary prescriptions for her sick dogs were recorded under her name, and Beverly Schechtman, who was denied pain medication due to her history of sexual abuse. Research suggests the algorithm produces high rates of false positives: one study found that only 11 percent of high scorers actually had opioid use disorder. The system has been criticized for lack of transparency, for potential racial and gender bias, and for targeting vulnerable patients with complex medical conditions who legitimately see multiple doctors.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, leaving them prone to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed