Epic Systems' AI sepsis detection model, used by 170+ hospitals, failed to identify 67% of sepsis cases, and 88% of its alerts were false alarms when tested at the University of Michigan, potentially compromising patient care for thousands.
Epic Systems, America's largest electronic health records company, maintaining data for 180 million U.S. patients, developed the Epic Sepsis Model (ESM) to predict sepsis onset in hospitals. The AI algorithm had been deployed at over 170 hospitals and health systems since 2017 without FDA review. A University of Michigan study published in JAMA Internal Medicine examined 38,455 patient records from December 2018 to October 2019, finding that ESM identified only 843 of 2,552 sepsis cases (33% sensitivity), missing 1,709 cases (67%). Of 6,971 sepsis alerts generated, only 843 were correct, an 88% false alarm rate (12% positive predictive value). The study found ESM identified only 7% of sepsis patients who had not received timely antibiotics. Epic had claimed strong performance (a reported area under the curve of 0.76 to 0.83), but no independent validation existed until this study. The company paid hospitals up to $1 million to adopt its algorithms, creating potential conflicts of interest. Epic later updated the model in August 2021 and changed its sepsis definition to match international consensus standards.
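The headline figures above follow directly from the study's raw counts; a minimal sketch of the arithmetic, assuming "false alarm rate" means the share of alerts that were not true sepsis cases (1 minus positive predictive value):

```python
# Recompute the ESM performance figures from the counts reported in the text.
true_positives = 843   # sepsis cases the model correctly flagged
total_cases = 2552     # all sepsis cases in the cohort
total_alerts = 6971    # all sepsis alerts the model generated

sensitivity = true_positives / total_cases   # fraction of cases caught
ppv = true_positives / total_alerts          # fraction of alerts that were correct
false_alarm_rate = 1 - ppv                   # fraction of alerts that were wrong

print(f"Sensitivity:     {sensitivity:.0%}")
print(f"PPV:             {ppv:.0%}")
print(f"False alarm rate: {false_alarm_rate:.0%}")
```

This reproduces the 33% sensitivity and 88% false alarm figures cited above, and makes clear that the 88% describes the alerts, not the patients.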
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, leaving them prone to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed