Learning Hierarchical Attention for Weakly-Supervised Chest X-Ray Abnormality Localization and Diagnosis
Authors: | Srikrishna Karanam, Jiayu Huo, Terrence Chen, Ziyan Wu, Xi Ouyang, Jie-Zhi Cheng, Xiang Sean Zhou, Qian Wang |
---|---|
Year of publication: | 2021 |
Subject: |
Computer Vision and Pattern Recognition (cs.CV), Machine learning, Medical imaging, Deep learning, X-rays, Radiography, Interpretability, Visualization, Location awareness, Task analysis, Algorithms, Software, Radiological and Ultrasound Technology, Electrical and Electronic Engineering, Computer Science Applications |
Source: | IEEE Transactions on Medical Imaging. 40:2698-2710 |
ISSN: | 1558-254X 0278-0062 |
Description: | We consider the problem of abnormality localization for clinical applications. While deep learning has driven much recent progress in medical imaging, many clinical challenges remain unaddressed, limiting its broader adoption. Although recent methods report high diagnostic accuracies, physicians have concerns about trusting these algorithmic results for diagnostic decision-making because such algorithms generally lack decision reasoning and interpretability. One potential way to address this problem is to further train these models to localize abnormalities in addition to classifying them. However, doing this accurately requires a large number of disease localization annotations from clinical experts, which is prohibitively expensive to obtain for most applications. In this work, we take a step towards addressing these issues with a new attention-driven weakly supervised algorithm comprising a hierarchical attention mining framework that unifies activation- and gradient-based visual attention in a holistic manner. Our key algorithmic innovations include the design of explicit ordinal attention constraints, enabling principled model training in a weakly supervised fashion while also facilitating the generation of visual-attention-driven model explanations by means of localization cues. On two large-scale chest X-ray datasets (NIH ChestX-ray14 and CheXpert), we demonstrate significant localization performance improvements over the current state of the art while also achieving competitive classification performance. Our code is available at https://github.com/oyxhust/HAM. |
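To make the two ingredients named in the abstract concrete, the following is a minimal NumPy sketch of (a) a gradient-weighted attention map in the Grad-CAM style, which the paper's "gradient-based visual attention" builds on, and (b) a hypothetical hinge-style ordinal attention constraint encouraging higher peak attention on abnormal images than on normal ones. The function names, the margin parameter, and the specific hinge form are illustrative assumptions, not the paper's exact HAM formulation.

```python
import numpy as np

def grad_cam_map(activations, gradients):
    """Gradient-weighted attention map (Grad-CAM style).

    activations: (C, H, W) feature maps from a convolutional layer.
    gradients:   (C, H, W) gradients of the class score w.r.t. those maps.
    Returns a (H, W) attention map normalized to [0, 1].
    """
    # Global-average-pool the gradients to get one weight per channel.
    weights = gradients.mean(axis=(1, 2))
    # Weighted sum of feature maps over channels.
    cam = np.einsum("c,chw->hw", weights, activations)
    # Keep only positive evidence, then normalize.
    cam = np.maximum(cam, 0.0)
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

def ordinal_attention_hinge(att_pos, att_neg, margin=0.1):
    """Illustrative ordinal constraint (assumption, not the paper's exact loss):
    penalize the model unless peak attention on a positive (abnormal) image
    exceeds peak attention on a negative image by at least `margin`."""
    return max(0.0, margin - (att_pos.max() - att_neg.max()))
```

Trained with weak (image-level) labels only, a penalty of this ordinal flavor pushes attention maps to concentrate on abnormal regions, which is what enables the localization cues described in the abstract.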
Database: | OpenAIRE |
External link: |