Multi-Attribute Attention Network for Interpretable Diagnosis of Thyroid Nodules in Ultrasound Images
Authors: Van T. Manh, Jianqiao Zhou, Xiaohong Jia, Zehui Lin, Wenwen Xu, Zihan Mei, Yijie Dong, Xin Yang, Ruobing Huang, Dong Ni
Language: English
Year of publication: 2022
Subject: Pattern Recognition; Acoustics and Ultrasonics; Image and Video Processing (eess.IV); Electrical and Electronic Engineering; Humans; Diagnosis, Computer-Assisted; Thyroid Nodule; Tomography, X-Ray Computed; Instrumentation; Ultrasonography
Description: Ultrasound (US) is the primary imaging technique for diagnosing thyroid cancer. However, accurately identifying nodule malignancy is a challenging task that can elude less-experienced clinicians. Recently, many computer-aided diagnosis (CAD) systems have been proposed to assist this process, but most do not explain the reasoning behind their classifications, which may undermine their credibility in practical use. To overcome this, we propose a novel deep learning framework called the multi-attribute attention network (MAA-Net), designed to mimic the clinical diagnosis process. The proposed model learns to predict nodular attributes and to infer malignancy from these clinically relevant features. A multi-attention scheme generates customized attention to improve each attribute task and the malignancy diagnosis. Furthermore, MAA-Net uses nodule delineations as spatial priors for the nodules during training, rather than cropping the nodules with additional models or human intervention, which would discard context information. Validation experiments were performed on a large and challenging dataset of 4,554 patients. Results show that the proposed method outperforms other state-of-the-art methods and provides interpretable predictions that may better suit clinical needs. (11 pages, 7 figures)
Database: OpenAIRE
External link:
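The description outlines the key architectural idea: per-attribute attention branches whose pooled features feed a malignancy classifier, mirroring the clinical reasoning chain. The following is a minimal PyTorch sketch of that idea only; the number of attributes, layer sizes, backbone, and fusion scheme are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a multi-attribute attention classifier in the spirit of
# MAA-Net, as described in the abstract. All hyperparameters (number of
# attributes, channel widths, backbone depth) are hypothetical.
import torch
import torch.nn as nn


class AttributeBranch(nn.Module):
    """One attribute head with its own spatial attention map."""

    def __init__(self, in_ch: int, n_classes: int):
        super().__init__()
        # 1x1 conv -> sigmoid produces a per-pixel attention weight.
        self.attn = nn.Sequential(nn.Conv2d(in_ch, 1, kernel_size=1), nn.Sigmoid())
        self.head = nn.Linear(in_ch, n_classes)

    def forward(self, feat):
        a = self.attn(feat)                    # (B, 1, H, W) attention map
        pooled = (feat * a).mean(dim=(2, 3))   # attention-weighted pooling
        return self.head(pooled), pooled, a


class MAANetSketch(nn.Module):
    def __init__(self, n_attributes: int = 4, attr_classes: int = 2, ch: int = 32):
        super().__init__()
        # Toy backbone standing in for the shared feature extractor.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, ch, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.branches = nn.ModuleList(
            AttributeBranch(ch, attr_classes) for _ in range(n_attributes)
        )
        # Malignancy is inferred from the attribute features, mimicking the
        # attribute-then-malignancy reasoning described in the abstract.
        self.malignancy = nn.Linear(ch * n_attributes, 2)

    def forward(self, x):
        feat = self.backbone(x)
        logits, pooled, attns = zip(*(b(feat) for b in self.branches))
        mal = self.malignancy(torch.cat(pooled, dim=1))
        return list(logits), mal, list(attns)


# A batch of two single-channel 64x64 "ultrasound" images.
model = MAANetSketch()
attr_logits, mal_logits, attn_maps = model(torch.randn(2, 1, 64, 64))
```

The returned attention maps are what make such a design inspectable: each one can be upsampled and overlaid on the input image to show which region drove each attribute prediction.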