Cross-Modality Knowledge Transfer for Prostate Segmentation from CT Scans
Author: Joseph N. Stember, Naji Khosravan, Sachin Jambawalikar, Jonathan Shoag, Yu-Cheng Liu, Yulin Liu, Ulas Bagci
Year of publication: 2019
Subject: Ground truth, Modality (human-computer interaction), Similarity (geometry), Computer science, Deep learning, ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION, Pattern recognition, Image segmentation, Field (computer science), Medical imaging, Segmentation, Artificial intelligence
Source: Domain Adaptation and Representation Transfer and Medical Image Learning with Less Labels and Imperfect Data, ISBN 9783030333904 (DART/MIL3ID@MICCAI)
DOI: 10.1007/978-3-030-33391-1_8
Description: Creating large-scale, high-quality annotations is a known challenge in medical imaging. In this work, based on the CycleGAN algorithm, we propose leveraging annotations from one modality to segment images in another. More specifically, the proposed algorithm creates highly realistic synthetic CT images (SynCT) from prostate MR images using unpaired data sets. By using SynCT images (without segmentation labels) and MR images (with segmentation labels available), we train a deep segmentation network for precise delineation of the prostate from real CT scans. The cycle consistency term of the CycleGAN generator guarantees that each SynCT shares the same manually drawn, high-quality mask originally delineated on the corresponding MR image. Further, we introduce a cost function based on the structural similarity index (SSIM) to improve the anatomical similarity between real and synthetic images; a minimal sketch of such a loss term appears after this record. For segmentation of the SynCT images produced by the CycleGAN, automatic delineation is achieved with a 2.5D Residual U-Net. Quantitative evaluation demonstrates comparable segmentation results between our SynCT-trained model and radiologist-drawn masks on real CT images, addressing an important problem in the medical image segmentation field when ground-truth annotations are not available for the modality of interest.
Database: OpenAIRE
External link:
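The abstract describes a CycleGAN whose cycle-consistency objective is augmented with an SSIM-based term. Below is a minimal PyTorch sketch of how such a combined generator loss could look. The generator/discriminator names (G_mr2ct, G_ct2mr, D_ct), the LSGAN-style adversarial term, the loss weights (lambda_cyc, lambda_ssim), the uniform SSIM window, and the choice to apply SSIM to the cycle-reconstructed MR image are all assumptions for illustration, not the authors' released implementation.

```python
# Hedged sketch: SSIM-augmented CycleGAN generator loss (one direction, MR -> SynCT -> MR).
# Assumes single-channel images scaled to [0, 1]; names and weights are illustrative only.
import torch
import torch.nn.functional as F

def ssim(x, y, window_size=11, C1=0.01**2, C2=0.03**2):
    """Mean structural similarity between two batches of (N, 1, H, W) images in [0, 1]."""
    # Uniform averaging window; a Gaussian window is also common in SSIM implementations.
    pad = window_size // 2
    kernel = torch.ones(1, 1, window_size, window_size, device=x.device) / window_size**2
    mu_x = F.conv2d(x, kernel, padding=pad)
    mu_y = F.conv2d(y, kernel, padding=pad)
    sigma_x = F.conv2d(x * x, kernel, padding=pad) - mu_x**2
    sigma_y = F.conv2d(y * y, kernel, padding=pad) - mu_y**2
    sigma_xy = F.conv2d(x * y, kernel, padding=pad) - mu_x * mu_y
    ssim_map = ((2 * mu_x * mu_y + C1) * (2 * sigma_xy + C2)) / (
        (mu_x**2 + mu_y**2 + C1) * (sigma_x + sigma_y + C2)
    )
    return ssim_map.mean()

def generator_loss(G_mr2ct, G_ct2mr, D_ct, real_mr, lambda_cyc=10.0, lambda_ssim=1.0):
    """MR -> SynCT -> reconstructed-MR half of a CycleGAN generator objective."""
    syn_ct = G_mr2ct(real_mr)          # synthetic CT produced from an MR slice
    rec_mr = G_ct2mr(syn_ct)           # cycle back to the MR domain
    pred = D_ct(syn_ct)
    adv = F.mse_loss(pred, torch.ones_like(pred))   # LSGAN-style adversarial term (assumption)
    cyc = F.l1_loss(rec_mr, real_mr)                # standard L1 cycle-consistency term
    # SSIM term encouraging the cycle reconstruction to preserve anatomical structure
    # (where exactly the paper applies SSIM is an assumption here).
    struct = 1.0 - ssim(rec_mr, real_mr)
    return adv + lambda_cyc * cyc + lambda_ssim * struct
```

In this sketch the SSIM term complements the L1 cycle loss: L1 penalizes per-pixel intensity differences, while 1 - SSIM penalizes loss of local structure, which is closer to the "anatomical similarity" goal stated in the abstract.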