LeSAM: Adapt Segment Anything Model for Medical Lesion Segmentation.

Authors: Gu Y, Wu Q, Tang H, Mai X, Shu H, Li B, Chen Y
Language: English
Source: IEEE Journal of Biomedical and Health Informatics [IEEE J Biomed Health Inform] 2024 Oct; Vol. 28 (10), pp. 6031-6041. Date of Electronic Publication: 2024 Oct 03.
DOI: 10.1109/JBHI.2024.3406871
Abstract: The Segment Anything Model (SAM) is a foundation model that has demonstrated impressive results in natural image segmentation. However, its performance remains suboptimal for medical image segmentation, particularly when delineating lesions with irregular shapes and low contrast. This can be attributed to the significant domain gap between medical images and the natural images on which SAM was originally trained. In this paper, we propose an adaptation of SAM specifically tailored for lesion segmentation, termed LeSAM. LeSAM first learns medical domain-specific knowledge through an efficient adaptation module and integrates it with the general knowledge obtained from the pre-trained SAM. It then uses this merged knowledge to generate lesion masks with a modified mask decoder implemented as a lightweight U-shaped network. This modification enables better delineation of lesion boundaries while keeping training straightforward. We conduct comprehensive experiments on lesion segmentation tasks spanning multiple image modalities, including CT scans, MRI scans, ultrasound images, dermoscopic images, and endoscopic images. Our proposed method outperforms previous state-of-the-art methods on 8 of 12 lesion segmentation tasks and achieves competitive performance on the remaining 4. Ablation studies further validate the effectiveness of the proposed adaptation modules and modified decoder.
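To make the two components named in the abstract concrete, below is a minimal sketch of (a) a residual adaptation module that injects learned domain-specific features into frozen pre-trained features, and (b) a lightweight U-shaped mask decoder with a skip connection. The record does not include the paper's implementation, so all module names, dimensions, and design details here are illustrative assumptions in the spirit of common SAM-adapter work, not LeSAM's actual code.

```python
# Illustrative sketch only; LeSAM's real architecture is not given in this
# record. Dimensions and module names are assumptions.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: learns domain-specific features and merges them
    with frozen pre-trained features via a residual connection."""
    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)  # project to small bottleneck
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck, dim)    # project back to feature dim

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, N, dim) token features from a frozen encoder block.
        return x + self.up(self.act(self.down(x)))

class UShapedDecoder(nn.Module):
    """Tiny U-shaped decoder: downsample, upsample, add a skip connection,
    then predict single-channel lesion mask logits."""
    def __init__(self, in_ch: int = 256, mid_ch: int = 128):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(in_ch, mid_ch, 3, stride=2, padding=1), nn.GELU())
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(mid_ch, in_ch, 2, stride=2), nn.GELU())
        self.head = nn.Conv2d(in_ch, 1, 1)  # per-pixel lesion logit

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        skip = feat                 # (B, in_ch, H, W)
        x = self.enc(feat)          # (B, mid_ch, H/2, W/2)
        x = self.dec(x) + skip      # skip connection restores spatial detail
        return self.head(x)         # (B, 1, H, W) mask logits

if __name__ == "__main__":
    # Stand-in for frozen image-encoder tokens on a 64x64 feature grid.
    tokens = torch.randn(1, 64 * 64, 256)
    adapted = Adapter(dim=256)(tokens)
    feat = adapted.transpose(1, 2).reshape(1, 256, 64, 64)
    print(UShapedDecoder(in_ch=256)(feat).shape)  # torch.Size([1, 1, 64, 64])
```

In this style of design, only the adapter and decoder parameters are trained while the pre-trained encoder stays frozen, which is what makes the adaptation efficient and easy to train.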
Database: MEDLINE