Towards an Explainable Mortality Prediction Model
| Field | Value |
|---|---|
| Author | Sharad Patel, Ghulam Rasool, Ravi P. Ramachandran, Jacob R. Epifano |
| Year | 2020 |
| Subject | Influence functions; robust statistics; artificial neural networks; machine learning; artificial intelligence; mortality prediction; computer science; emergency & critical care medicine; clinical medicine; medical and health sciences |
| Source | MLSP |
| DOI | 10.1109/mlsp49062.2020.9231833 |
| Description | Influence functions are analytical tools from robust statistics that can help interpret the decisions of black-box machine learning models. They can be used to attribute changes in the loss function to small perturbations of the input features. Current work on influence functions is limited to the features available before the last layer of deep neural networks (DNNs). We extend the influence function approximation to DNNs by computing gradients in an end-to-end manner and relate changes in the loss function to individual input features with an efficient algorithm. We propose an accurate mortality prediction neural network and demonstrate the effectiveness of the extended influence functions on the eICU dataset. The features chosen by the proposed extended influence functions were more similar to those selected by human experts than the features chosen by traditional methods. |
| Database | OpenAIRE |
| External link | |
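
The description attributes changes in the loss to individual input features by propagating influence-function gradients end-to-end through the DNN. The paper's exact algorithm and architecture are not reproduced in this record; the sketch below only illustrates the underlying first-order idea, assuming PyTorch, with a hypothetical `MortalityNet`, synthetic data, and illustrative feature names that are not taken from the paper or the eICU dataset.

```python
# Minimal sketch (assumption: PyTorch) of gradient-based, per-feature loss
# attribution computed end-to-end through a DNN. Model, features, and data
# are hypothetical placeholders, not the paper's network or dataset.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MortalityNet(nn.Module):
    """Small feed-forward classifier standing in for the paper's DNN."""

    def __init__(self, n_features: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)  # logit of predicted mortality risk


def feature_influence(model: nn.Module, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Approximate each input feature's influence on the loss as the gradient
    of the loss with respect to that feature, taken end-to-end through the net."""
    x = x.clone().requires_grad_(True)
    loss = F.binary_cross_entropy_with_logits(model(x), y)
    (grad,) = torch.autograd.grad(loss, x)
    # First-order estimate: for a small perturbation dx, dL ≈ grad * dx,
    # so |grad| ranks features by their sensitivity on this batch.
    return grad


if __name__ == "__main__":
    torch.manual_seed(0)
    features = ["age", "heart_rate", "lactate", "gcs"]  # illustrative names only
    model = MortalityNet(len(features))
    x = torch.randn(8, len(features))        # batch of synthetic patients
    y = torch.randint(0, 2, (8,)).float()    # synthetic mortality labels
    influence = feature_influence(model, x, y)
    scores = influence.abs().mean(dim=0)     # average sensitivity per feature
    for idx in scores.argsort(descending=True):
        print(f"{features[idx]}: {scores[idx].item():.4f}")
```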