A CNN-based approach for joint segmentation and quantification of nuclei and NORs in AgNOR-stained images.
Author: | Rönnau MM; Instituto de Informática, Universidade Federal do Rio Grande do Sul, Av. Gonçalves, 9500, Porto Alegre, 91501-970, RS, Brazil. Electronic address: maikel.ronnau@inf.ufrgs.br., Lepper TW; Faculdade de Odontologia, Universidade Federal do Rio Grande do Sul, R. Ramiro Barcelos, 2492, Porto Alegre, 90035-003, RS, Brazil. Electronic address: tatiana.lepper@ufrgs.br., Amaral LN; Faculdade de Odontologia, Universidade Federal do Rio Grande do Sul, R. Ramiro Barcelos, 2492, Porto Alegre, 90035-003, RS, Brazil. Electronic address: luara.amaral@ufrgs.br., Rados PV; Faculdade de Odontologia, Universidade Federal do Rio Grande do Sul, R. Ramiro Barcelos, 2492, Porto Alegre, 90035-003, RS, Brazil. Electronic address: pantelis@ufrgs.br., Oliveira MM; Instituto de Informática, Universidade Federal do Rio Grande do Sul, Av. Gonçalves, 9500, Porto Alegre, 91501-970, RS, Brazil. Electronic address: oliveira@inf.ufrgs.br. |
Language: | English |
Source: | Computer methods and programs in biomedicine [Comput Methods Programs Biomed] 2023 Dec; Vol. 242, pp. 107788. Date of Electronic Publication: 2023 Sep 07. |
DOI: | 10.1016/j.cmpb.2023.107788 |
Abstract: | Background and Objective: Oral cancer is the sixth most common human cancer. Brush cytology with counting of Argyrophilic Nucleolar Organizer Regions (AgNORs) can support early oral cancer detection, lowering patient mortality. However, the manual counting of AgNORs still in use today is time-consuming, labor-intensive, and error-prone. The goal of our work is to address these shortcomings by proposing a convolutional neural network (CNN)-based method that automatically segments individual nuclei and AgNORs in microscope slide images and counts the number of AgNORs within each nucleus. Methods: We systematically defined, trained, and tested 102 CNNs in search of a high-performing solution. This included the evaluation of 51 network architectures combining 17 encoders with 3 decoders and 2 loss functions. These CNNs were trained and evaluated on a new AgNOR-stained image dataset of epithelial cells from the oral mucosa, containing 1,171 images from 48 patients with ground truth annotated by specialists. The annotations were greatly facilitated by a semi-automatic procedure developed in our project. Overlapping nuclei, which tend to hide AgNORs and thus affect their true count, were discarded using an automatic solution also developed in our project. Besides the evaluation on the test dataset, the robustness of the best-performing model was assessed against the results produced by a group of human experts on a second dataset. Results: The best-performing CNN model on the test dataset was a DenseNet-169 + LinkNet with Focal Loss (DenseNet-169 as encoder and LinkNet as decoder). It obtained a Dice score of 0.90 and an intersection over union (IoU) of 0.84. The counting of nuclei and AgNORs achieved precision and recall of 0.94 and 0.90 for nuclei, and 0.82 and 0.74 for AgNORs, respectively. 
Our solution achieved a performance similar to that of human experts on a set of 291 images from 6 new patients, obtaining an Intraclass Correlation Coefficient (ICC) of 0.91 for nuclei and 0.81 for AgNORs, with 95% confidence intervals of [0.89, 0.93] and [0.77, 0.84], respectively, and p-values < 0.001, confirming statistical significance. Our AgNOR-stained image dataset is the most diverse publicly available one in terms of number of patients and the first for oral cells. Conclusions: CNN-based joint segmentation and quantification of nuclei and NORs in AgNOR-stained images achieves expert-like performance while being orders of magnitude faster than the latter. Our solution demonstrated this by showing strong agreement with the results produced by a group of specialists, highlighting its potential to accelerate diagnostic workflows. Our trained model, code, and dataset are publicly available and can stimulate new research in early oral cancer detection. Competing Interests: Declaration of Competing Interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. (Copyright © 2023 Elsevier B.V. All rights reserved.) |
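For reference, the Dice score and IoU reported in the abstract are standard overlap measures between a predicted segmentation mask and the ground-truth mask. The sketch below shows the textbook definitions on binary masks; it is an illustration of the metrics only, not the authors' evaluation code, and the example arrays are made up.

```python
import numpy as np

def dice_iou(pred, gt):
    """Dice score and IoU for two binary segmentation masks of equal shape."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    # Dice = 2|A ∩ B| / (|A| + |B|);  IoU = |A ∩ B| / |A ∪ B|
    dice = 2.0 * inter / (pred.sum() + gt.sum())
    iou = inter / union
    return dice, iou

# Tiny hypothetical masks: 2 overlapping pixels, 3 predicted, 3 true.
pred = np.array([[1, 1, 0], [0, 1, 0]])
gt   = np.array([[1, 0, 0], [0, 1, 1]])
d, i = dice_iou(pred, gt)  # dice = 2*2/(3+3) ≈ 0.667, iou = 2/4 = 0.5
```

Note that Dice is always at least as large as IoU for the same masks (Dice = 2·IoU / (1 + IoU)), which is consistent with the paper's reported 0.90 Dice vs. 0.84 IoU.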
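The expert-agreement result is reported as an ICC. The abstract does not state which ICC form was used; the sketch below assumes the common two-way random-effects, absolute-agreement, single-measurement form ICC(2,1), computed from the standard ANOVA mean squares. It is a generic illustration, not the authors' statistical pipeline.

```python
import numpy as np

def icc2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    scores: array of shape (n_subjects, k_raters), e.g. per-image counts
    produced by the model and by human experts.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    mean_rows = scores.mean(axis=1)   # per-subject means
    mean_cols = scores.mean(axis=0)   # per-rater means
    grand = scores.mean()
    ss_rows = k * ((mean_rows - grand) ** 2).sum()
    ss_cols = n * ((mean_cols - grand) ** 2).sum()
    ss_total = ((scores - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # mean square, subjects
    msc = ss_cols / (k - 1)                 # mean square, raters
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical counts: two raters in perfect agreement across three images.
scores = [[10, 10], [20, 20], [30, 30]]
val = icc2_1(scores)  # perfect agreement -> 1.0
```

An ICC near 0.91 (nuclei) or 0.81 (AgNORs), as reported, indicates good-to-excellent agreement between the model and the expert group under common interpretation guidelines.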
Database: | MEDLINE |
External link: |