Cigna used an AI algorithm called PXDX to automatically reject over 300,000 patient insurance claims in two months, spending only 1.2 seconds per review without doctors examining patient files.
Cigna Corporation and Cigna Health and Life Insurance Company deployed an algorithm called PXDX (procedure-to-diagnosis) to process insurance claims. According to a federal class-action lawsuit filed in Sacramento, the system automatically rejected more than 300,000 payment claims in just two months in 2022, spending an average of only 1.2 seconds reviewing each claim to determine whether it met certain requirements. Large batches of rejected claims were then sent to doctors, who signed off on the denials without opening patient files or conducting individual reviews. The lawsuit alleges this violated California's requirement for 'thorough, fair, and objective' investigations of medical bills. Two plaintiffs were affected: Suzanne Kisting-Leung was denied coverage for an ultrasound ordered because of suspected ovarian cancer risk, and Ayesha Smiley was denied coverage for a vitamin D deficiency test ordered by her doctor. Cigna has 18 million U.S. members, including over 2 million in California. The company defended the system as an industry-standard review process for common, low-cost procedures designed to expedite physician reimbursement, stating that the reviews occur after treatment and therefore do not result in denials of care.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing those who depend on them to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed