Interpretation and visualization techniques for deep learning models in medical imaging
Author: Amy J Weisman, Robert Jeraj, Daniel T Huff
Year of publication: 2020
Subject: Diagnostic Imaging; Medical Imaging; Machine Learning; Deep Learning; Artificial Intelligence; Image Processing, Computer-Assisted; Humans; Radiology, Nuclear Medicine and Imaging; Radiological and Ultrasound Technology; Interpretation; Visualization; Interpretability; Saliency; Review; Article
Source: Physics in Medicine and Biology
ISSN: 1361-6560
Description: Deep learning (DL) approaches to medical image analysis tasks have recently become popular; however, they suffer from a lack of human interpretability, which is critical both for increasing understanding of how the methods operate and for enabling clinical translation. This review summarizes currently available methods for performing image model interpretation and critically evaluates published uses of these methods in medical imaging applications. We divide model interpretation into two categories: (1) understanding model structure and function and (2) understanding model output. Understanding model structure and function summarizes ways to inspect a model's learned features and how those features act on an image. We discuss techniques for reducing the dimensionality of high-dimensional data and cover autoencoders, both of which can also be leveraged for model interpretation. Understanding model output covers attribution-based methods, such as saliency maps and class activation maps, which produce heatmaps describing the importance of different parts of an image to the model's prediction. We describe the mathematics behind these methods, give examples of their use in medical imaging, and compare them against one another. We summarize several published toolkits for model interpretation specific to medical imaging applications, cover the limitations of current model interpretation methods, provide recommendations for DL practitioners looking to incorporate model interpretation into their tasks, and offer general discussion on the importance of model interpretation in medical imaging contexts.
Database: OpenAIRE
External link:
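To illustrate the attribution-based methods mentioned in the description above, the following is a minimal sketch of a vanilla gradient saliency map in PyTorch. The tiny classifier, the random input tensor, and all names are hypothetical stand-ins chosen for this sketch; they are not taken from the reviewed paper, which should be consulted for the formal definitions and medical imaging use cases of these methods.

```python
# Hypothetical sketch: vanilla gradient saliency map.
# The model and image below are illustrative placeholders, not from the reviewed paper.
import torch
import torch.nn as nn

# Tiny stand-in classifier for a single-channel scan with two output classes.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 2),
)
model.eval()

# Dummy image standing in for a medical scan; gradients flow back to the input.
image = torch.rand(1, 1, 64, 64, requires_grad=True)

scores = model(image)                 # forward pass: per-class scores
target = scores.argmax(dim=1).item()  # explain the predicted class
scores[0, target].backward()          # gradient of that class score w.r.t. the input

# Saliency heatmap: absolute input gradient, one value per pixel.
saliency = image.grad.abs().squeeze()  # shape (64, 64); larger values = more influential pixels
```

In practice the heatmap would be overlaid on the input image for inspection. Class activation maps follow a similar spirit but build the heatmap from the last convolutional feature maps weighted by class-specific weights rather than from raw input gradients.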