Automatic intraoperative stitching of nonoverlapping cone-beam CT acquisitions
Authors: Mehran Armand, Stefan Reichenstein, Greg Osgood, Javad Fotouhi, Alex Johnson, Mathias Unberath, Bernhard Fuerst, Nassir Navab, Sing Chun Lee
Year of publication: 2018
Subject: Time Factors; Infrared Rays; Swine; Computer science; Video Recording; Simultaneous localization and mapping; Tracking; Image stitching; Intraoperative Period; Imaging, Three-Dimensional; Fiducial Markers; Animals; Humans; Minimally Invasive Surgical Procedures; Orthopedic Procedures; Computer vision; Femur; Pose; Cone-beam CT; Phantoms, Imaging; Tracking system; General Medicine; Cone-Beam Computed Tomography; Pattern Recognition, Automated; Calibration; RGB color model; Artificial intelligence
Source: Medical Physics 45:2463-2475
ISSN: 0094-2405
DOI: 10.1002/mp.12877
Description:
PURPOSE: Cone-beam computed tomography (CBCT) is one of the primary imaging modalities in radiation therapy, dentistry, and orthopedic interventions. While CBCT provides crucial intraoperative information, its imaging volume is limited, which reduces its effectiveness. This paper introduces an approach for real-time intraoperative stitching of overlapping and nonoverlapping CBCT volumes, enabling 3D measurements on large anatomical structures.
METHODS: A CBCT-capable mobile C-arm is augmented with a red-green-blue-depth (RGBD) camera. An offline co-calibration of the two imaging modalities yields co-registered video, infrared, and X-ray views of the surgical scene. Automatic stitching of multiple small, nonoverlapping CBCT volumes then becomes possible by recovering the relative motion of the C-arm with respect to the patient from the camera observations. We propose three methods to recover the relative pose: RGB-based tracking of visual markers placed near the surgical site, RGBD-based simultaneous localization and mapping (SLAM) of the surgical scene, which combines color and depth information for pose estimation, and surface tracking of the patient using only the depth data provided by the RGBD sensor.
RESULTS: On an animal cadaver, we show stitching errors as low as 0.33, 0.91, and 1.72 mm when the visual marker, RGBD SLAM, and surface data are used for tracking, respectively.
CONCLUSIONS: The proposed method overcomes one of the major limitations of CBCT C-arm systems by integrating vision-based tracking and expanding the imaging volume without any intraoperative use of calibration grids or external tracking systems. We believe this solution is most appropriate for 3D intraoperative verification of several orthopedic procedures.
Database: OpenAIRE
External link:
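The description above outlines the core stitching step: chain the offline camera-to-CBCT co-calibration with the camera motion recovered intraoperatively (visual markers, RGBD SLAM, or depth-only surface tracking) to express one CBCT volume in the coordinate frame of another. The sketch below illustrates that chaining and the resampling of the second volume. It is not the authors' implementation; all names, the transform-direction conventions, and the assumption of identical isotropic voxel grids are illustrative.

```python
# Minimal sketch, assuming both CBCT volumes share an identical isotropic
# voxel grid and all 4x4 transforms are expressed in voxel units.
import numpy as np
from scipy.ndimage import affine_transform


def invert_rigid(T):
    """Invert a 4x4 rigid-body transform (rotation + translation)."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti


def stitch(vol1, vol2, T_cbct_cam, T_cam1_cam2):
    """Resample vol2 into the voxel frame of vol1 and fuse the two volumes.

    vol1, vol2   : 3D numpy arrays, the two CBCT acquisitions (hypothetical inputs)
    T_cbct_cam   : 4x4 offline co-calibration, camera frame -> CBCT voxel frame
    T_cam1_cam2  : 4x4 camera motion, mapping points in the camera frame at
                   acquisition 2 to the camera frame at acquisition 1, as
                   recovered by marker, SLAM, or surface tracking
    """
    # Chain the transforms: volume-2 voxels expressed in the volume-1 frame.
    T_vol1_vol2 = T_cbct_cam @ T_cam1_cam2 @ invert_rigid(T_cbct_cam)

    # scipy's affine_transform maps *output* coordinates to *input* coordinates,
    # so it needs the inverse mapping (frame-1 voxels -> frame-2 voxels).
    # Axis-ordering conventions (z, y, x) are assumed consistent across inputs.
    T_out_to_in = invert_rigid(T_vol1_vol2)
    vol2_in_frame1 = affine_transform(
        vol2,
        matrix=T_out_to_in[:3, :3],
        offset=T_out_to_in[:3, 3],
        output_shape=vol1.shape,   # a real pipeline would enlarge the canvas
        order=1,                   # trilinear interpolation
        cval=0.0,
    )
    # Naive fusion; seam or overlap regions would normally be blended.
    return np.maximum(vol1, vol2_in_frame1)
```

As written, the sketch only returns a volume with the extent of the first scan; stitching truly nonoverlapping acquisitions additionally requires enlarging the output canvas so that both volumes fit, and, in a clinical pipeline, blending or feathering the seam.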