Interpretable Outcome Prediction with Sparse Bayesian Neural Networks in Intensive Care

Authors: Overweg, Hiske; Popkes, Anna-Lena; Ercole, Ari; Li, Yingzhen; Hernández-Lobato, José Miguel; Zaykov, Yordan; Zhang, Cheng
Year of publication: 2019
Subject:
Document type: Working Paper
Description: Clinical decision making is challenging because of pathological complexity, as well as the large amounts of heterogeneous data generated as part of routine clinical care. In recent years, machine learning tools have been developed to aid this process. Intensive care unit (ICU) admissions represent the most data-dense and time-critical patient care episodes. In this context, prediction models may help clinicians determine which patients are most at risk and prioritize care. However, flexible tools such as artificial neural networks (ANNs) suffer from a lack of interpretability, which limits their acceptability to clinicians. In this work, we propose a novel interpretable Bayesian neural network architecture which offers both the flexibility of ANNs and interpretability in terms of feature selection. In particular, we employ a sparsity-inducing prior distribution in a tied manner (sketched after this record) to learn which features are important for outcome prediction. We evaluate our approach on the task of mortality prediction using two real-world ICU cohorts. In collaboration with clinicians, we found that, in addition to the predicted outcome, our approach can provide novel insights into the importance of different clinical measurements. This suggests that our model can support medical experts in their decision-making process.
Database: arXiv
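
The description's key technical idea is a first-layer prior in which all outgoing weights of an input feature share a single sparsity-inducing scale, so features whose scale collapses are effectively switched off. Below is a minimal, illustrative sketch of that tied-scale idea, not the authors' model or released code: it assumes PyTorch, a half-Cauchy-like prior on per-feature scales, mean-field Gaussian variational weights, and invented names such as SparseInputBNN and regulariser.

import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseInputBNN(nn.Module):
    """Illustrative BNN with a tied, sparsity-inducing scale per input feature."""

    def __init__(self, d_in, d_hidden):
        super().__init__()
        # Mean-field Gaussian variational parameters for the first-layer weights.
        self.w1_mu = nn.Parameter(0.1 * torch.randn(d_in, d_hidden))
        self.w1_logsig = nn.Parameter(torch.full((d_in, d_hidden), -3.0))
        # One log-scale per input feature, shared (tied) across all outgoing
        # weights of that feature; a collapsed scale switches the feature off.
        self.log_s_mu = nn.Parameter(torch.zeros(d_in))
        self.log_s_logsig = nn.Parameter(torch.full((d_in,), -3.0))
        # Deterministic output layer keeps the sketch short.
        self.out = nn.Linear(d_hidden, 1)

    def forward(self, x):
        # Reparameterised samples of per-feature scales and first-layer weights.
        log_s = self.log_s_mu + self.log_s_logsig.exp() * torch.randn_like(self.log_s_mu)
        w1 = self.w1_mu + self.w1_logsig.exp() * torch.randn_like(self.w1_mu)
        w1 = w1 * log_s.exp().unsqueeze(1)  # tie the scale over outgoing weights
        h = torch.tanh(x @ w1)
        return self.out(h).squeeze(-1)      # logit of the predicted outcome

    def regulariser(self):
        # Closed-form KL to a standard normal for the (unscaled) weights ...
        kl_w = 0.5 * (self.w1_mu ** 2 + (2 * self.w1_logsig).exp()
                      - 2 * self.w1_logsig - 1).sum()
        # ... plus a heavy-tailed half-Cauchy(0, 1) prior on the feature scales,
        # evaluated at the variational mean (a crude but compact approximation).
        s = self.log_s_mu.exp()
        return kl_w + torch.log1p(s ** 2).sum()


# Toy usage: only the first two of twenty features carry signal.
torch.manual_seed(0)
X = torch.randn(256, 20)
y = (X[:, 0] - X[:, 1] > 0).float()
model = SparseInputBNN(d_in=20, d_hidden=16)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = F.binary_cross_entropy_with_logits(model(X), y) + model.regulariser() / len(X)
    loss.backward()
    opt.step()
# Posterior feature scales act as importance scores; informative features keep
# larger scales while the rest shrink towards zero.
print(model.log_s_mu.exp().detach())

The learned per-feature posterior scales can be read off as importance scores, which mirrors the feature-selection interpretation described in the abstract; the exact prior and inference scheme used by the authors may differ.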