Author: Hong N. Dao, Tuyen Nguyen, Cherubin Mugisha, Incheon Paik
Language: English
Year of publication: 2024
Subject:
Source: IEEE Access, Vol 12, pp. 75496-75507 (2024)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3401777
Description: Medical image data often suffer from scarcity and costly annotation processes. To overcome this, our study introduces a novel transfer learning method for medical image classification. We present a multimodal learning framework that incorporates the pre-trained PubMedCLIP model and multimodal feature fusion; prompts of different complexities are combined with images as inputs to the proposed model. Our findings demonstrate that this approach significantly enhances image classification while reducing annotation costs. Our study underscores the potential of PubMedCLIP's prompt-based approach for medical image analysis and showcases the value of multi-modality for training robust models in healthcare. Code is available at: https://github.com/HongJapan/MTL_prompt_medical.git
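For orientation, below is a minimal sketch of how prompt-plus-image feature fusion on top of a frozen PubMedCLIP-style backbone could be wired up with Hugging Face transformers. The checkpoint id, the concatenation fusion head, and the example prompt are illustrative assumptions, not the authors' implementation; see the linked repository for the actual code.

```python
# Minimal sketch (assumptions noted inline) of prompt + image feature fusion
# with a frozen CLIP-style backbone for medical image classification.
import torch
import torch.nn as nn
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Assumed PubMedCLIP checkpoint id; substitute the checkpoint used in the paper.
MODEL_ID = "flaviagiammarino/pubmed-clip-vit-base-patch32"


class PromptImageClassifier(nn.Module):
    """Concatenates frozen image and prompt embeddings, then classifies."""

    def __init__(self, num_classes: int):
        super().__init__()
        self.backbone = CLIPModel.from_pretrained(MODEL_ID)
        for p in self.backbone.parameters():   # keep the pre-trained encoders frozen
            p.requires_grad = False
        dim = self.backbone.config.projection_dim          # 512 for ViT-B/32 variants
        self.classifier = nn.Linear(2 * dim, num_classes)  # simple concatenation fusion

    def forward(self, pixel_values, input_ids, attention_mask):
        img = self.backbone.get_image_features(pixel_values=pixel_values)
        txt = self.backbone.get_text_features(
            input_ids=input_ids, attention_mask=attention_mask
        )
        fused = torch.cat([img, txt], dim=-1)   # multimodal feature fusion
        return self.classifier(fused)


processor = CLIPProcessor.from_pretrained(MODEL_ID)
model = PromptImageClassifier(num_classes=2)

# Example of a simple prompt paired with an image (file name is a placeholder).
prompt = "A chest X-ray showing signs of pneumonia."
inputs = processor(
    text=[prompt],
    images=Image.open("xray.png"),
    return_tensors="pt",
    padding=True,
)
logits = model(inputs["pixel_values"], inputs["input_ids"], inputs["attention_mask"])
```

More complex prompts can be swapped in at the `prompt` string without changing the model, which is the sense in which prompt complexity varies in the framework described above.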
Database: Directory of Open Access Journals
External link: