Lesion attention guided neural network for contrast-enhanced mammography-based biomarker status prediction in breast cancer.

Author: Qian N; Department of Biomedical Engineering, Medical School, Tianjin University, Tianjin 300072, China; State Key Laboratory of Advanced Medical Materials and Devices, Tianjin University, Tianjin, China., Jiang W; Department of Biomedical Engineering, Medical School, Tianjin University, Tianjin 300072, China; State Key Laboratory of Advanced Medical Materials and Devices, Tianjin University, Tianjin, China; Department of Radiotherapy, Yantai Yuhuangding Hospital, Shandong 264000, China., Wu X; Department of Radiation Oncology, The Affiliated Hospital of Qingdao University, Qingdao 266071, China., Zhang N; Department of Biomedical Engineering, Medical School, Tianjin University, Tianjin 300072, China; State Key Laboratory of Advanced Medical Materials and Devices, Tianjin University, Tianjin, China., Yu H; Department of Biomedical Engineering, Medical School, Tianjin University, Tianjin 300072, China; State Key Laboratory of Advanced Medical Materials and Devices, Tianjin University, Tianjin, China., Guo Y; Department of Biomedical Engineering, Medical School, Tianjin University, Tianjin 300072, China; State Key Laboratory of Advanced Medical Materials and Devices, Tianjin University, Tianjin, China. Electronic address: guoyu@tju.edu.cn.
Language: English
Source: Computer Methods and Programs in Biomedicine [Comput Methods Programs Biomed] 2024 Jun; Vol. 250, pp. 108194. Date of Electronic Publication: 2024 Apr 22.
DOI: 10.1016/j.cmpb.2024.108194
Abstract: Background and Objective: Accurate identification of molecular biomarker statuses is crucial in cancer diagnosis, treatment, and prognosis. Studies have demonstrated that medical images can be used for non-invasive prediction of biomarker statuses. The biomarker status-associated features extracted from medical images are essential in developing medical image-based non-invasive prediction models. Contrast-enhanced mammography (CEM) is a promising imaging technique for breast cancer diagnosis. This study aims to develop a neural network-based method to extract biomarker-related image features from CEM images and to evaluate the potential of CEM for non-invasive biomarker status prediction.
Methods: An end-to-end convolutional neural network taking whole-breast images as inputs was proposed to extract CEM features for biomarker status prediction in breast cancer. The network focused on lesion regions and flexibly extracted image features from lesion and peri-tumor regions through supervised learning with a smooth L1-based consistency constraint. For automatic lesion segmentation, an image-level weakly supervised segmentation network based on a Vision Transformer was developed, using cross attention to contrast images of breasts with lesions against the contralateral breast images. Finally, prediction models were built by selecting significant features and applying random forest-based classification. Results were reported using the area under the curve (AUC), accuracy, sensitivity, and specificity.
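The smooth L1-based consistency constraint described above can be illustrated with a minimal sketch. The helper below is a hypothetical stand-in (the paper's exact formulation, attention-map shape, and `beta` threshold are assumptions): it penalizes the disagreement between a network's spatial attention map and a weakly supervised lesion mask with the standard smooth L1 (Huber-style) penalty, which is quadratic for small residuals and linear for large ones.

```python
import numpy as np

def smooth_l1(diff, beta=1.0):
    """Element-wise smooth L1 penalty: 0.5*d^2/beta for |d| < beta,
    |d| - 0.5*beta otherwise (quadratic near zero, linear in the tails)."""
    abs_diff = np.abs(diff)
    return np.where(abs_diff < beta,
                    0.5 * abs_diff ** 2 / beta,
                    abs_diff - 0.5 * beta)

def consistency_loss(attention_map, lesion_mask, beta=1.0):
    """Mean smooth L1 distance between an attention map and a lesion mask --
    a hypothetical stand-in for the lesion-attention consistency term."""
    return float(np.mean(smooth_l1(attention_map - lesion_mask, beta)))

# Toy example: a 4x4 attention map against a binary lesion mask
# covering the top-right quadrant.
att = np.array([[0.1, 0.2, 0.8, 0.9],
                [0.0, 0.1, 0.7, 1.0],
                [0.0, 0.0, 0.2, 0.3],
                [0.0, 0.0, 0.0, 0.1]])
mask = np.zeros((4, 4))
mask[:2, 2:] = 1.0
loss = consistency_loss(att, mask)  # small when attention matches the lesion
```

Because the penalty grows only linearly for large residuals, a few strongly mismatched pixels do not dominate the constraint, which is one common motivation for choosing smooth L1 over a plain squared-error term.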
Results: A dataset from 1203 breast cancer patients was used to develop and evaluate the proposed method. Compared with a variant without lesion attention and one using only lesion regions as inputs, the proposed method performed better at biomarker status prediction. Specifically, it achieved an AUC of 0.71 (95 % confidence interval [CI]: 0.65, 0.77) for Ki-67 and 0.73 (95 % CI: 0.65, 0.80) for human epidermal growth factor receptor 2 (HER2).
Conclusions: A lesion attention-guided neural network was proposed in this work to extract CEM image features for biomarker status prediction in breast cancer. The promising results demonstrate the potential of CEM for non-invasive prediction of biomarker statuses in breast cancer.
Competing Interests: Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
(Copyright © 2024 Elsevier B.V. All rights reserved.)
Database: MEDLINE