Several artificial intelligence algorithms developed by Epic Systems, the nation’s largest electronic health record vendor, are delivering inaccurate or irrelevant information to hospitals about the care of seriously ill patients, contrasting sharply with the company’s published claims, a STAT investigation found.
Employees of several major health systems said they were particularly concerned about Epic’s algorithm for predicting sepsis, a life-threatening complication of infection. The algorithm, they said, routinely fails to identify the condition in advance and triggers frequent false alarms. Some hospitals reported a benefit for patients after fine-tuning the model, but that process took at least a year.
STAT’s investigation, based on interviews with data scientists, ethics experts, and many of Epic’s largest and most influential clients, underscores the need for extreme caution in using artificial intelligence algorithms to guide the care of patients. Errant alarms may lead to unnecessary care or divert clinicians from treating sicker patients in emergency departments or intensive care units, where time and attention are finite resources.