Robust multi-modal prostate cancer classification via feature autoencoder and dual attention

Author: Bochong Li, Ryo Oka, M.D., Ping Xuan, Yuichiro Yoshimura, PhD, Toshiya Nakaguchi
Language: English
Year of publication: 2022
Subject:
Source: Informatics in Medicine Unlocked, Vol 30, Iss , Pp 100923- (2022)
Document type: article
ISSN: 2352-9148
DOI: 10.1016/j.imu.2022.100923
Description: Prostate cancer is the second leading cause of cancer death in men. Current methods for classifying early cancer grades on MRI rely mainly on a single imaging modality and exhibit low robustness. This paper therefore explores a method for classifying cancer grades on multi-modality MRI images while maintaining robustness. We propose a novel and effective multi-modal convolutional neural network for discriminating the clinical severity grade of prostate cancer, the Robust Multi-modal Feature Autoencoder Attention net (RMANet), which substantially improves both accuracy and robustness. T2-weighted and diffusion-weighted imaging are used in this work. The model consists of two branches: one learns the overall features of the two MRI modalities through a ten-layer CNN whose weights are shared across the two inputs, while the other uses an autoencoder with the classical U-Net as its backbone to learn modality-specific features and improve the robustness of the classification model. A novel dual attention mechanism is added to the overall-feature branch, better directing the model's learning focus toward cancerous regions. Experiments were conducted on the ProstateX dataset augmented with hospital data. Compared with baseline methods, multi-modal input methods, and state-of-the-art (SOTA) methods, the proposed model achieves a higher AUC on the test set (reaching 0.84) than other classical models and most recent methods, and a higher sensitivity (reaching 0.84) than recent methods.
Database: Directory of Open Access Journals
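The abstract's two key ideas, a dual attention mechanism (channel-wise then spatial gating) and a shared-weight branch applied identically to the T2-weighted and DWI feature maps before fusion, can be sketched in a minimal form. This is an illustrative simplification, not the paper's implementation: the function `dual_attention`, the gate formulas, and the concatenation fusion are all assumptions chosen to make the concept concrete.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dual_attention(feat):
    """Hypothetical dual attention over a (C, H, W) feature map:
    a channel gate followed by a spatial gate, loosely in the
    spirit of the paper's dual attention mechanism."""
    # Channel attention: global average pool -> per-channel gate in (0, 1)
    chan = sigmoid(feat.mean(axis=(1, 2)))          # shape (C,)
    feat = feat * chan[:, None, None]
    # Spatial attention: channel-mean map -> per-pixel gate in (0, 1)
    spat = sigmoid(feat.mean(axis=0))               # shape (H, W)
    return feat * spat[None, :, :]

# Shared weights across modalities: the SAME function (and, in a real
# network, the same learned parameters) is applied to both the T2W and
# DWI feature maps; the attended maps are then fused by concatenation.
t2 = np.random.rand(8, 16, 16)   # stand-in T2-weighted features
dwi = np.random.rand(8, 16, 16)  # stand-in diffusion-weighted features
fused = np.concatenate([dual_attention(t2), dual_attention(dwi)], axis=0)
print(fused.shape)  # (16, 16, 16)
```

Because both gates lie in (0, 1), attention can only suppress feature responses, which is how such a mechanism steers the network's focus toward salient (here, cancerous) regions.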