2D/3D Deep Registration for Real-Time Prostate Biopsy Navigation

Author: Jocelyne Troccaz, Clément Beitone, Tamara Dupuy, Sandrine Voros
Contributors: Gestes Medico-chirurgicaux Assistés par Ordinateur (TIMC-GMCAO), Translational Innovation in Medicine and Complexity / Recherche Translationnelle et Innovation en Médecine et Complexité - UMR 5525 (TIMC), VetAgro Sup - Institut national d'enseignement supérieur et de recherche en alimentation, santé animale, sciences agronomiques et de l'environnement (VAS), Centre National de la Recherche Scientifique (CNRS), Université Grenoble Alpes (UGA), Institut polytechnique de Grenoble - Grenoble Institute of Technology (Grenoble INP); funding: ANR MIAI and CAMI grants, PronavIA Aura grant, SPIE, ANR-19-P3IA-0003 (MIAI @ Grenoble Alpes, 2019), ANR-11-LABX-0004 (CAMI, Gestes Médico-Chirurgicaux Assistés par Ordinateur, 2011)
Language: English
Publication year: 2021
Subject:
Source: SPIE Medical Imaging 2021
SPIE Medical Imaging 2021, SPIE, Feb 2021, Digital congress, United States. pp.115981P, ⟨10.1117/12.2579874⟩
DOI: 10.1117/12.2579874
Description: Cum Laude Best Poster Award (topic: Image-Guided Procedures, Robotic Interventions, and Modeling); International audience; The accuracy of biopsy sampling and the related tumor localization are major issues for prostate cancer diagnosis and therapy. However, navigating accurately to biopsy targets is made difficult both by the properties of transrectal ultrasound (TRUS) image guidance and by prostate motion and deformation. To reduce inaccuracy and exam duration, the main objective of this study is to develop real-time navigation assistance that provides the current probe position and orientation with respect to the deformable organ and the next biopsy targets. We propose a real-time deep-learning 2D/3D registration method based on convolutional neural networks (CNNs) to localize the current 2D US image relative to the available 3D TRUS reference volume. We experiment with several scenarios combining different input data, including a pair of successive 2D US images, the optical flow between them, and current probe tracking information. The main novelty of our study is to take prior navigation trajectory information into account by introducing the previous registration result. The model is evaluated on clinical data through simulated biopsy trajectories. The results show a significant improvement when trajectory information is exploited, especially through prior registration results and probe tracking parameters. With such trajectory information, we achieve an average registration error of 2.21 ± 2.89 mm. The network generalizes well to new patients and new trajectories, which is promising for continuous tracking during biopsy procedures.
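The sketch below illustrates one plausible way to wire together the inputs named in the abstract (two successive 2D US frames, their optical flow, probe tracking parameters, and the previous registration result) into a CNN that regresses the pose of the current slice within the 3D TRUS volume. It is a minimal assumption-laden example, not the authors' actual architecture: layer sizes, the 6-parameter rigid pose output, and the prior-vector dimension are all hypothetical choices made for illustration.

```python
# Illustrative sketch only (not the published network): a CNN that regresses a
# rigid pose (3 translations + 3 rotations) for the current 2D TRUS frame
# within the 3D reference volume, fusing image inputs with trajectory priors.
import torch
import torch.nn as nn


class SliceToVolumeRegNet(nn.Module):
    def __init__(self, prior_dim: int = 12):
        # prior_dim: hypothetical size of the concatenated probe-tracking
        # parameters and previous registration result (6 + 6 here).
        super().__init__()
        # Image branch: two successive 2D US frames (2 ch) + optical flow (2 ch).
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Fusion head: image features concatenated with trajectory priors.
        self.head = nn.Sequential(
            nn.Linear(128 + prior_dim, 128), nn.ReLU(),
            nn.Linear(128, 6),  # tx, ty, tz, rx, ry, rz of the current slice
        )

    def forward(self, frames, flow, prior):
        # frames: (B, 2, H, W) previous and current 2D US images
        # flow:   (B, 2, H, W) optical flow between the two frames
        # prior:  (B, prior_dim) probe tracking + previous registration result
        x = torch.cat([frames, flow], dim=1)
        feat = self.encoder(x).flatten(1)
        return self.head(torch.cat([feat, prior], dim=1))


if __name__ == "__main__":
    net = SliceToVolumeRegNet()
    frames = torch.randn(1, 2, 128, 128)
    flow = torch.randn(1, 2, 128, 128)
    prior = torch.randn(1, 12)
    print(net(frames, flow, prior).shape)  # torch.Size([1, 6])
```

At inference time such a network would be run per incoming frame, with the pose it predicts fed back as part of the prior vector for the next frame, which is one way the trajectory information described in the abstract could be exploited.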
Database: OpenAIRE