Combined 2D and 3D tracking of surgical instruments for minimally invasive and robotic-assisted surgery
Author: Xiaofei Du, Danail Stoyanov, Sebastien Ourselin, Alessio Dore, David J. Hawkes, Maximilian Allan, John D. Kelly
Year of publication: 2016
Subject: instrument tracking and detection; minimally invasive surgery; robot-assisted surgery; 3D tracking; surgical vision; computer vision; image processing; artificial intelligence; fast motion; Surgical Instruments; Minimally Invasive Surgical Procedures; Robotic Surgical Procedures; Surgery, Computer-Assisted; Humans; Motion; Biomedical Engineering; Health Informatics; Radiology, Nuclear Medicine and Imaging; General Medicine; Computer Graphics and Computer-Aided Design; Computer Science Applications; Computer Vision and Pattern Recognition
Source: International Journal of Computer Assisted Radiology and Surgery
ISSN: 1861-6410, 1861-6429
Description:
Purpose: Computer-assisted interventions for enhanced minimally invasive surgery (MIS) require tracking of the surgical instruments. Instrument tracking is a challenging problem in both conventional and robotic-assisted MIS, but vision-based approaches are a promising solution with minimal hardware integration requirements. However, vision-based methods suffer from drift, and in the case of occlusions, shadows and fast motion, they can be subject to complete tracking failure.
Methods: In this paper, we develop a 2D tracker based on a Generalized Hough Transform using SIFT features, which can both handle complex environmental changes and recover from tracking failure. We use this to initialize a 3D tracker at each frame, which enables us to recover the 3D instrument pose over long sequences and even during occlusions.
Results: We quantitatively validate our method in 2D and 3D with ex vivo data collected from a DVRK controller, as well as providing qualitative validation on robotic-assisted in vivo data.
Conclusions: We demonstrate from our extended sequences that our method provides drift-free, robust and accurate tracking. Our occlusion-based sequences additionally demonstrate that our method can recover from occlusion-based failure. In both cases, we show an improvement over using 3D tracking alone, suggesting that combining 2D and 3D tracking is a promising solution to challenges in surgical instrument tracking.
Electronic supplementary material: The online version of this article (doi:10.1007/s11548-016-1393-4) contains supplementary material, which is available to authorized users.
Database: OpenAIRE
External link:
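
As a rough illustration of the scheme summarised in the abstract above (a Generalized Hough Transform over SIFT feature matches that re-detects the instrument in 2D and re-initializes a 3D pose estimator at every frame), the following is a minimal, hypothetical Python/OpenCV sketch. The helper names (`build_model`, `hough_vote_2d`, `track_sequence`), the `reinit_3d` callback, and parameters such as the bin size and ratio-test threshold are assumptions made for illustration; this is not the authors' implementation.

```python
# Illustrative sketch only: a Generalized-Hough-style 2D re-detection over SIFT
# matches, plus a per-frame loop that hands the 2D result to a 3D pose callback.
# All names and parameters are assumptions, not the authors' implementation.
import cv2
import numpy as np

sift = cv2.SIFT_create()


def build_model(template_gray):
    """Store the template's SIFT keypoints together with each keypoint's offset
    to the template centre (a simple R-table for the Generalized Hough vote)."""
    kps, desc = sift.detectAndCompute(template_gray, None)
    h, w = template_gray.shape[:2]
    centre = np.float32([w / 2.0, h / 2.0])
    offsets = [centre - np.float32(kp.pt) for kp in kps]
    return kps, desc, offsets


def hough_vote_2d(frame_gray, model, bin_size=16, ratio=0.75):
    """Each SIFT match casts a vote for the instrument centre, rotated and
    scaled by the relative keypoint pose; the densest bin is the re-detection."""
    kps_m, desc_m, offsets = model
    kps_f, desc_f = sift.detectAndCompute(frame_gray, None)
    if desc_m is None or desc_f is None:
        return None
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(desc_m, desc_f, k=2)
    acc = {}
    for pair in matches:
        if len(pair) < 2 or pair[0].distance > ratio * pair[1].distance:
            continue  # Lowe's ratio test rejects ambiguous matches
        m = pair[0]
        kp_m, kp_f = kps_m[m.queryIdx], kps_f[m.trainIdx]
        scale = kp_f.size / kp_m.size
        theta = np.deg2rad(kp_f.angle - kp_m.angle)
        c, s = np.cos(theta), np.sin(theta)
        ox, oy = offsets[m.queryIdx]
        vote = np.float32(kp_f.pt) + scale * np.float32([c * ox - s * oy,
                                                         s * ox + c * oy])
        key = tuple((vote // bin_size).astype(int))
        acc[key] = acc.get(key, 0) + 1
    if not acc:
        return None  # no consistent evidence: report a detection failure
    best = max(acc, key=acc.get)
    return (np.float32(best) + 0.5) * bin_size  # estimated centre (x, y)


def track_sequence(frames, template_gray, reinit_3d):
    """Per frame: re-detect the instrument in 2D, then let `reinit_3d` (any
    caller-supplied model-based 3D pose estimator) start from that detection,
    so drift or a lost track never becomes unrecoverable."""
    model = build_model(template_gray)
    poses = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        centre_2d = hough_vote_2d(gray, model)
        poses.append(None if centre_2d is None else reinit_3d(frame, centre_2d))
    return poses
```

Keeping the 3D estimator behind a callback mirrors the point made in the abstract: the per-frame 2D re-detection is what restores tracking after drift, occlusion or fast motion, while any region- or model-based 3D tracker could be plugged in behind it.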