To explain or not to explain?-Artificial intelligence explainability in clinical decision support systems.

Author: Amann J; Health Ethics and Policy Lab, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland., Vetter D; Frankfurt Big Data Lab, Goethe University Frankfurt am Main, Germany.; Computational Vision and Artificial Intelligence, Goethe University Frankfurt am Main, Germany., Blomberg SN; University of Copenhagen, Copenhagen Emergency Medical Services, Denmark., Christensen HC; University of Copenhagen, Copenhagen Emergency Medical Services, Denmark., Coffee M; Department of Medicine and Division of Infectious Diseases and Immunology, NYU Grossman School of Medicine, New York, United States of America., Gerke S; Penn State Dickinson Law, Carlisle, PA, United States of America., Gilbert TK; Digital Life Initiative, Cornell Tech, New York, NY, United States of America., Hagendorff T; Cluster of Excellence 'Machine Learning: New Perspectives for Science', Ethics & Philosophy Lab, University of Tuebingen, Germany., Holm S; Department of Food and Resource Economics, Faculty of Science, University of Copenhagen, Denmark., Livne M; Google Health Research, London, United Kingdom., Spezzatti A; Industrial Engineering & Operations Research Department, University of California, Berkeley, United States of America., Strümke I; Department of Holistic Systems, Simula Metropolitan Center for Digital Engineering, Oslo, Norway.; Department of Engineering Cybernetics, Norwegian University of Science and Technology, Trondheim, Norway., Zicari RV; Yrkeshögskolan Arcada, Helsinki, Finland.; Data Science Graduate School, Seoul National University, Seoul, South Korea., Madai VI; QUEST Center for Responsible Research, Berlin Institute of Health (BIH), Charité Universitätsmedizin Berlin, Germany.; CLAIM-Charité Lab for Artificial Intelligence in Medicine, Charité Universitätsmedizin Berlin, Germany.; School of Computing and Digital Technology, Faculty of Computing, Engineering and the Built Environment, Birmingham City University, United Kingdom.
Language: English
Source: PLOS Digital Health [PLOS Digit Health] 2022 Feb 17; Vol. 1 (2), pp. e0000016. Date of Electronic Publication: 2022 Feb 17 (Print Publication: 2022).
DOI: 10.1371/journal.pdig.0000016
Abstract: Explainability for artificial intelligence (AI) in medicine is a hotly debated topic. Our paper presents a review of the key arguments for and against explainability for AI-powered Clinical Decision Support Systems (CDSSs), applied to a concrete use case: an AI-powered CDSS currently used in the emergency call setting to identify patients with life-threatening cardiac arrest. More specifically, we performed a normative analysis using socio-technical scenarios to provide a nuanced account of the role of explainability for CDSSs in this concrete use case, allowing for abstraction to a more general level. Our analysis focused on three layers: technical considerations, human factors, and the designated system role in decision-making. Our findings suggest that whether explainability can provide added value to a CDSS depends on several key questions: technical feasibility, the level of validation in the case of explainable algorithms, the characteristics of the context in which the system is implemented, the designated role in the decision-making process, and the key user group(s). Thus, each CDSS will require an individualized assessment of explainability needs, and we provide an example of what such an assessment could look like in practice.
Competing Interests: VIM reported receiving personal fees from ai4medicine outside the submitted work. There is no connection, commercial exploitation, transfer or association between the projects of ai4medicine and the results presented in this work.
(Copyright: © 2022 Amann et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.)
Database: MEDLINE