Using Explainable Artificial Intelligence Models (ML) to Predict Suspected Diagnoses as Clinical Decision Support

Authors: Zully Ritter, Stefan Vogel, Frank Schultze, Kerstin Pischek-Koch, Wiebke Schirrmeister, Felix Walcher, Rainer Röhrig, Tibor Kesztyüs, Dagmar Krefting, Sabine Blaschke
Year of publication: 2022
Subject:
Source: Studies in Health Technology and Informatics, 294
ISSN: 1879-8365
Description: The complexity of emergency cases and the number of emergency patients have increased dramatically. Because specialist medical staff in emergency departments (EDs) are reduced or even absent, diagnoses are often made without professional supervision. The result can be failures in diagnosis and treatment, in the worst case death, as well as high expenditure of time and high costs. Using accurate patient data from the German national registry of emergency departments (AKTIN registry, aktin.org), the 20 most frequent diagnoses were selected for building explainable artificial intelligence (XAI) models as part of the ENSURE project (umg.eu). In total, 137,152 samples and 51 features (vital signs and symptoms) were analyzed. The XAI models achieved a mean one-vs-rest area under the curve (AUC) of 0.98 for logistic regression (LR) and 0.99 for the random forest (RF), and predictive accuracies of 0.927 (LR) and 0.99 (RF). Based on its degree of explainability and performance, the best model will be incorporated into a portable clinical decision support system (CDSS) to improve diagnoses and outcomes of ED treatment and reduce costs. The CDSS will be tested in a clinical pilot study at the EDs of selected hospitals in Germany.
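The abstract does not specify the modeling toolkit or hyperparameters. The following is a minimal sketch, assuming a scikit-learn workflow, of how multiclass diagnosis models (LR and RF) could be trained on a tabular feature matrix and evaluated with one-vs-rest AUC and accuracy, the metrics reported above; the synthetic dataset and all parameter values are placeholders, not the AKTIN data or the authors' actual setup.

```python
# Hypothetical sketch: multiclass diagnosis prediction with LR and RF,
# evaluated by one-vs-rest AUC and accuracy (metrics named in the abstract).
# Synthetic data stands in for the AKTIN registry features (vital signs, symptoms).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder dataset: 51 features, 20 diagnosis classes (the study used 137,152 samples).
X, y = make_classification(n_samples=5000, n_features=51, n_informative=30,
                           n_classes=20, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "logistic_regression": make_pipeline(StandardScaler(),
                                         LogisticRegression(max_iter=2000)),
    "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    proba = model.predict_proba(X_test)          # class probabilities, shape (n_samples, 20)
    auc_ovr = roc_auc_score(y_test, proba, multi_class="ovr", average="macro")
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: one-vs-rest AUC = {auc_ovr:.3f}, accuracy = {acc:.3f}")
```

Both model families lend themselves to explanation (LR coefficients, RF feature importances or SHAP values), which is presumably why they were chosen as candidates for an explainable CDSS, though the abstract does not state which explanation method was used.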
Database: OpenAIRE