Author: |
Zhang, Yupei, He, Xiuxiu, Tian, Zhen, Jeong, Jiwoong Jason, Lei, Yang, Wang, Tonghe, Zeng, Qiulan, Jani, Ashesh B., Curran, Walter J., Patel, Pretesh, Liu, Tian, Yang, Xiaofeng |
Source: |
IEEE Transactions on Medical Imaging; Jul 2020, Vol. 39, Issue 7, p2302-2315, 14p |
Abstract: |
Accurate and automatic multi-needle detection in three-dimensional (3D) ultrasound (US) is a key step in treatment planning for US-guided brachytherapy. However, most current studies concentrate on single-needle detection using only a small number of images containing a needle, disregarding the massive database of US images without needles. In this paper, we propose a workflow for multi-needle detection that treats the images without needles as auxiliary data. Concretely, we train position-specific dictionaries on overlapping 3D patches of the auxiliary images, for which we develop an enhanced sparse dictionary learning method, dubbed order-graph regularized dictionary learning, that integrates the spatial continuity of 3D US. Using the learned dictionaries, target images are reconstructed to obtain residual pixels, which are then clustered in every slice to yield centers. From the obtained centers, regions of interest (ROIs) are constructed by seeking cylinders. Finally, we detect needles by applying the random sample consensus (RANSAC) algorithm in each ROI and then locate the tips by finding the sharp intensity drop along the detected axis of every needle. Extensive experiments were conducted on a phantom dataset and a prostate dataset of 70/21 patients without/with needles. Visual and quantitative results show the effectiveness of the proposed workflow. Specifically, our method correctly detects 95% of needles with a tip location error of 1.01 mm on the prostate dataset. This technique provides accurate multi-needle detection for US-guided high-dose-rate (HDR) prostate brachytherapy, facilitating the clinical workflow. [ABSTRACT FROM AUTHOR] |
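Illustrative note: the detection step summarized above applies RANSAC within each ROI to recover a needle axis from candidate voxels. Below is a minimal, self-contained sketch of that idea, not the authors' implementation; the function name fit_line_ransac, the inlier tolerance, and the synthetic ROI data are all assumptions made for illustration.

# Minimal sketch (illustrative, not the paper's code): RANSAC line fitting to
# candidate needle voxels inside one ROI. Parameter values are assumptions.
import numpy as np


def fit_line_ransac(points, n_iters=500, inlier_tol=1.0, seed=None):
    """Fit a 3D line to candidate needle voxels with RANSAC.

    points : (N, 3) array of voxel coordinates inside one ROI.
    Returns (point_on_line, unit_direction, inlier_mask).
    """
    rng = np.random.default_rng(seed)
    best_inliers = None

    for _ in range(n_iters):
        # Hypothesize a line from two distinct sample points.
        i, j = rng.choice(len(points), size=2, replace=False)
        p0, p1 = points[i], points[j]
        d = p1 - p0
        norm = np.linalg.norm(d)
        if norm < 1e-6:
            continue
        d = d / norm

        # Perpendicular distance of every point to the hypothesized line.
        diff = points - p0
        dist = np.linalg.norm(diff - np.outer(diff @ d, d), axis=1)
        inliers = dist < inlier_tol

        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers

    # Refine the axis on the inliers via SVD (least-squares direction).
    inlier_pts = points[best_inliers]
    center = inlier_pts.mean(axis=0)
    _, _, vt = np.linalg.svd(inlier_pts - center)
    return center, vt[0], best_inliers


if __name__ == "__main__":
    # Synthetic ROI: noisy voxels along a needle-like line plus random outliers.
    rng = np.random.default_rng(0)
    axis = np.array([0.1, 0.05, 1.0])
    axis = axis / np.linalg.norm(axis)
    t = np.linspace(0, 40, 80)[:, None]
    needle = np.array([10.0, 10.0, 0.0]) + t * axis
    needle += rng.normal(scale=0.3, size=needle.shape)
    outliers = rng.uniform(0, 40, size=(30, 3))
    pts = np.vstack([needle, outliers])

    center, direction, inliers = fit_line_ransac(pts, inlier_tol=1.0, seed=1)
    print("axis direction:", np.round(direction, 3), "inliers:", int(inliers.sum()))

In a real pipeline the inlier tolerance would typically be chosen from the US voxel spacing and the needle diameter, and the clustered residual centers described in the abstract would supply the candidate points for each ROI.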
Database: |
Complementary Index |
External link: |
|