Cross-Modal Prostate Cancer Segmentation via Self-Attention Distillation.

Author: Zhang, Guokai; Shen, Xiaoang; Zhang, Yu-Dong; Luo, Ye; Luo, Jihao; Zhu, Dandan; Yang, Hanmei; Wang, Weigang; Zhao, Binghui; Lu, Jianwei
Subject:
Source: IEEE Journal of Biomedical & Health Informatics; Nov 2022, Vol. 26, Issue 11, pp. 5298-5309, 12 pp.
Abstract: Automatic and accurate segmentation of prostate cancer from multi-modal magnetic resonance images is of prime importance for disease assessment and follow-up treatment planning. However, how to use multi-modal image features efficiently remains a challenging problem in medical image segmentation. In this paper, we develop a cross-modal self-attention distillation network that fully exploits the information encoded in the intermediate layers of different modalities; the attention maps generated for each modality enable the model to transfer significant, discriminative, and more detailed information across modalities. Moreover, a novel spatially correlated feature fusion module is employed to learn complementary correlations and non-linear information across modality images. We evaluate our model with five-fold cross-validation on 358 biopsy-confirmed MRI images. Without bells and whistles, our proposed network achieves state-of-the-art performance in extensive experiments. [ABSTRACT FROM AUTHOR]
Database: Complementary Index
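
The cross-modal self-attention distillation described in the abstract is reminiscent of attention-transfer style training, in which spatial attention maps derived from intermediate features of each modality branch are aligned by an auxiliary loss. Below is a minimal PyTorch sketch of that general idea; the function names (spatial_attention_map, attention_distillation_loss), the mean-of-squared-activations attention definition, and the MSE alignment are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def spatial_attention_map(feat: torch.Tensor) -> torch.Tensor:
    # Collapse a feature map (B, C, H, W) into a normalized spatial
    # attention map (B, H*W) by averaging squared activations over channels.
    att = feat.pow(2).mean(dim=1)        # (B, H, W)
    att = att.flatten(1)                 # (B, H*W)
    return F.normalize(att, p=2, dim=1)  # unit L2 norm per sample

def attention_distillation_loss(feat_a: torch.Tensor,
                                feat_b: torch.Tensor) -> torch.Tensor:
    # L2 distance between the attention maps of two modality branches.
    # If the spatial sizes differ, resize one feature map before comparing.
    if feat_a.shape[-2:] != feat_b.shape[-2:]:
        feat_b = F.interpolate(feat_b, size=feat_a.shape[-2:],
                               mode='bilinear', align_corners=False)
    return F.mse_loss(spatial_attention_map(feat_a),
                      spatial_attention_map(feat_b))

# Toy usage: intermediate encoder features from two MRI modalities
# (e.g., T2-weighted and ADC), produced by modality-specific encoders.
t2_feat = torch.randn(2, 64, 32, 32)
adc_feat = torch.randn(2, 64, 32, 32)
loss_ad = attention_distillation_loss(t2_feat, adc_feat)
print(loss_ad.item())

In practice, such a distillation term would be added to the segmentation objective at several encoder stages, with the fusion of modality features handled by a separate module such as the spatially correlated feature fusion described in the abstract.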