Prototype Knowledge Distillation for Medical Segmentation with Missing Modality
Authors: Wang, Shuai; Yan, Zipei; Zhang, Daoan; Wei, Haining; Li, Zhongsen; Li, Rui
Publication Year: 2023
Document Type: Working Paper
Description: Multi-modality medical imaging is crucial in clinical treatment, as it provides complementary information for medical image segmentation. However, collecting multi-modal data in clinical practice is difficult due to limited scan time and other clinical constraints. It is therefore clinically meaningful to develop a segmentation paradigm that can handle this missing-modality problem. In this paper, we propose a prototype knowledge distillation (ProtoKD) method to tackle this challenging problem, especially in the toughest scenario where only single-modality data is available. Specifically, ProtoKD not only distills the pixel-wise knowledge of multi-modality data into a single-modality model but also transfers intra-class and inter-class feature variations, so that the student model learns more robust feature representations from the teacher model and can perform inference with only a single modality. Our method achieves state-of-the-art performance on the BraTS benchmark. The code is available at \url{https://github.com/SakurajimaMaiii/ProtoKD}. Comment: ICASSP 2023. v1: camera-ready version; v2: fix typos and release code
Database: arXiv
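The description above outlines the core idea of prototype-based distillation: class prototypes are formed from pixel features, and the student's pixel-to-prototype relations are aligned with the teacher's so that intra-class and inter-class feature structure is transferred. The snippet below is a minimal illustrative sketch of that idea in PyTorch, not the authors' released implementation (see the linked repository for the actual ProtoKD code); the function names (`class_prototypes`, `proto_kd_loss`), the cosine-similarity choice, and the temperature `tau` are assumptions made for this example.

```python
# Hypothetical sketch of prototype-based knowledge distillation for segmentation.
import torch
import torch.nn.functional as F


def class_prototypes(features, labels, num_classes):
    """Masked average pooling: one prototype (mean feature vector) per class.

    features: (B, C, H, W) feature maps; labels: (B, H, W) integer masks.
    Returns a (num_classes, C) tensor of prototypes.
    """
    b, c, h, w = features.shape
    feats = features.permute(0, 2, 3, 1).reshape(-1, c)   # (B*H*W, C)
    labs = labels.reshape(-1)                              # (B*H*W,)
    protos = []
    for k in range(num_classes):
        mask = (labs == k).float().unsqueeze(1)            # (B*H*W, 1)
        denom = mask.sum().clamp(min=1.0)
        protos.append((feats * mask).sum(0) / denom)       # (C,)
    return torch.stack(protos, dim=0)                      # (K, C)


def prototype_similarity(features, prototypes):
    """Cosine similarity between every pixel feature and every class prototype."""
    b, c, h, w = features.shape
    feats = F.normalize(features.permute(0, 2, 3, 1).reshape(-1, c), dim=1)
    protos = F.normalize(prototypes, dim=1)
    return feats @ protos.t()                              # (B*H*W, K)


def proto_kd_loss(student_feats, teacher_feats, labels, num_classes, tau=1.0):
    """Align the student's pixel-to-prototype similarity distribution with the
    teacher's, transferring intra-class and inter-class feature relations."""
    with torch.no_grad():
        t_protos = class_prototypes(teacher_feats, labels, num_classes)
        t_sim = prototype_similarity(teacher_feats, t_protos)
    s_protos = class_prototypes(student_feats, labels, num_classes)
    s_sim = prototype_similarity(student_feats, s_protos)
    return F.kl_div(F.log_softmax(s_sim / tau, dim=1),
                    F.softmax(t_sim / tau, dim=1),
                    reduction="batchmean") * tau * tau


if __name__ == "__main__":
    # Toy shapes: multi-modal teacher features vs. single-modal student features.
    feats_t = torch.randn(2, 32, 16, 16)
    feats_s = torch.randn(2, 32, 16, 16, requires_grad=True)
    masks = torch.randint(0, 4, (2, 16, 16))
    loss = proto_kd_loss(feats_s, feats_t, masks, num_classes=4)
    loss.backward()
    print(float(loss))
```

In a full training setup this term would presumably be combined with the ordinary segmentation loss and a pixel-wise logit distillation loss between the multi-modal teacher and the single-modal student.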