Technical Note: Design and implementation of a high-throughput pipeline for reconstruction and quantitative analysis of CT image data.

Author: Hoffman J; Department of Radiological Sciences, David Geffen School of Medicine at UCLA, Los Angeles, CA, 90024, USA., Emaminejad N; Physics and Biology in Medicine Graduate Program, David Geffen School of Medicine at UCLA, Los Angeles, CA, 90024, USA., Wahi-Anwar M; Physics and Biology in Medicine Graduate Program, David Geffen School of Medicine at UCLA, Los Angeles, CA, 90024, USA., Kim GH; Department of Radiological Sciences, David Geffen School of Medicine at UCLA, Los Angeles, CA, 90024, USA.; Physics and Biology in Medicine Graduate Program, David Geffen School of Medicine at UCLA, Los Angeles, CA, 90024, USA., Brown M; Department of Radiological Sciences, David Geffen School of Medicine at UCLA, Los Angeles, CA, 90024, USA.; Physics and Biology in Medicine Graduate Program, David Geffen School of Medicine at UCLA, Los Angeles, CA, 90024, USA., Young S; Department of Radiological Sciences, David Geffen School of Medicine at UCLA, Los Angeles, CA, 90024, USA., McNitt-Gray M; Department of Radiological Sciences, David Geffen School of Medicine at UCLA, Los Angeles, CA, 90024, USA.; Physics and Biology in Medicine Graduate Program, David Geffen School of Medicine at UCLA, Los Angeles, CA, 90024, USA.
Language: English
Source: Medical Physics [Med Phys] 2019 May; Vol. 46 (5), pp. 2310-2322. Date of Electronic Publication: 2019 Apr 03.
DOI: 10.1002/mp.13401
Abstract: Purpose: With recent substantial improvements in modern computing, interest in quantitative imaging with CT has increased dramatically, and with it the need to create and analyze large, high-quality datasets of clinical studies. At present, no efficient, widely available method exists to accomplish this. The purpose of this technical note is to describe an open-source, high-throughput computational pipeline framework for the reconstruction and analysis of diagnostic CT imaging data, intended to enable large-scale quantitative imaging studies and to accelerate and improve quantitative imaging research.
Methods: The pipeline consists of two primary "blocks": reconstruction and analysis. Reconstruction is carried out via a graphics processing unit (GPU) queuing framework, developed specifically for the pipeline, that allows a dataset to be reconstructed under a variety of parameter configurations such as slice thickness, reconstruction kernel, and simulated acquisition dose. The analysis block then automatically analyzes the reconstruction output using "modules" that can be combined in different ways to conduct different experiments. Analysis is accelerated using cluster processing. Efficiency and performance of the pipeline are demonstrated using an example 142-subject lung screening cohort reconstructed 36 different ways and analyzed with quantitative emphysema scoring techniques.
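To make the parameter-sweep and modular-analysis ideas above concrete, the following is a minimal illustrative sketch in Python of how such a reconstruction job queue and pluggable analysis chain could be organized. It is not the authors' implementation; every name in it (ReconJob, gpu_worker, lung_segmentation, emphysema_score, and so on) is a hypothetical stand-in, and the real GPU reconstruction call is replaced with a placeholder.

# Illustrative sketch only -- hypothetical names, not the authors' actual API.
# Models a GPU job queue that "reconstructs" each raw CT scan under every
# combination of slice thickness, kernel, and simulated dose, then hands the
# results to interchangeable analysis modules.
import itertools
import queue
import threading
from dataclasses import dataclass

@dataclass(frozen=True)
class ReconJob:
    """One reconstruction task: a raw scan plus one parameter configuration."""
    raw_scan: str             # identifier of raw projection data (hypothetical)
    slice_thickness_mm: float
    kernel: str
    dose_percent: int         # simulated (reduced) acquisition dose

def build_jobs(raw_scans, thicknesses, kernels, doses):
    """Cartesian product of scans and parameter settings, one job per combination."""
    for scan, (t, k, d) in itertools.product(
            raw_scans, itertools.product(thicknesses, kernels, doses)):
        yield ReconJob(scan, t, k, d)

def gpu_worker(jobs, results):
    """Drain the shared job queue; each worker stands in for one GPU."""
    while True:
        try:
            job = jobs.get_nowait()
        except queue.Empty:
            return
        # Placeholder for the real GPU reconstruction call.
        recon = (f"{job.raw_scan}_{job.slice_thickness_mm}mm_"
                 f"{job.kernel}_{job.dose_percent}pct")
        results.put(recon)

def analyze(recon, modules):
    """Run a configurable chain of analysis modules on one reconstruction."""
    return {"recon": recon, **{m.__name__: m(recon) for m in modules}}

# Hypothetical analysis modules (stand-ins for real segmentation/scoring code).
def lung_segmentation(recon):
    return "lung_mask"

def emphysema_score(recon):
    return 0.0

if __name__ == "__main__":
    raw_scans = [f"subject_{i:03d}" for i in range(3)]          # toy cohort
    jobs, results = queue.Queue(), queue.Queue()
    for job in build_jobs(raw_scans, [1.0, 2.0], ["smooth", "sharp"], [100, 25, 10]):
        jobs.put(job)                                           # 3 scans x 12 configurations = 36 jobs
    workers = [threading.Thread(target=gpu_worker, args=(jobs, results))
               for _ in range(2)]                               # two "GPUs"
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    modules = [lung_segmentation, emphysema_score]              # pluggable module chain
    while not results.empty():
        print(analyze(results.get(), modules))

In the study, 36 reconstruction configurations were applied to each of the 142 subjects; the particular breakdown used here (2 thicknesses × 2 kernels × 3 dose levels over 3 toy scans) is only an illustration of the Cartesian-product sweep, not the study's actual parameter set.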
Results: The pipeline reconstructed and analyzed the 5112 resulting datasets (142 subjects × 36 reconstruction configurations) in approximately 10 days, a roughly 72× speedup over previous efforts that used the scanner itself for reconstruction. Tightly coupled quality assurance software ensured proper performance of the analysis modules with regard to segmentation and emphysema scoring.
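For a sense of the kind of quantitative emphysema scoring and QA check the analysis modules provide, the sketch below computes a relative-area score (the percentage of segmented lung voxels below a -950 HU threshold, a commonly used emphysema metric) together with a simple sanity check. The -950 HU threshold, the minimum-voxel criterion, and all function names are assumptions for illustration, not details taken from the paper.

# Illustrative sketch, assuming a relative-area emphysema metric (percentage of
# lung voxels below -950 HU); the threshold, QA criterion, and names are
# assumptions for illustration, not the paper's specification.
import numpy as np

def relative_area_score(hu_volume, lung_mask, threshold_hu=-950.0):
    """Percent of segmented lung voxels with attenuation below threshold_hu."""
    lung_voxels = hu_volume[lung_mask]
    if lung_voxels.size == 0:
        raise ValueError("Empty lung segmentation -- flag for QA review.")
    return 100.0 * np.count_nonzero(lung_voxels < threshold_hu) / lung_voxels.size

def qa_check(score, lung_mask, min_lung_voxels=100_000):
    """Simple sanity checks of the kind a tightly coupled QA layer might apply."""
    return 0.0 <= score <= 100.0 and np.count_nonzero(lung_mask) >= min_lung_voxels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    volume = rng.normal(-850.0, 120.0, size=(64, 64, 64))    # toy HU volume
    mask = np.ones(volume.shape, dtype=bool)                  # toy "segmentation"
    score = relative_area_score(volume, mask)
    print(f"score = {score:.1f}%, QA pass: {qa_check(score, mask)}")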
Conclusions: The pipeline greatly reduced the time from experiment conception to quantitative results. Its modular design allows the high-throughput framework to be reused for future experiments on other quantitative imaging techniques. Applications currently being explored include robustness testing of quantitative imaging metrics, data generation for deep learning, and use as a test platform for image-processing techniques to improve clinical quantitative imaging.
(© 2019 American Association of Physicists in Medicine.)
Database: MEDLINE