Anti-Aliasing Attention U-net Model for Skin Lesion Segmentation

Authors: Phuong Thi Le, Bach-Tung Pham, Ching-Chun Chang, Yi-Chiung Hsu, Tzu-Chiang Tai, Yung-Hui Li, Jia-Ching Wang
Language: English
Year of publication: 2023
Subject:
Source: Diagnostics, Vol 13, Iss 8, p 1460 (2023)
Document type: article
ISSN: 2075-4418
DOI: 10.3390/diagnostics13081460
Description: The need for a lightweight and reliable segmentation algorithm is critical in many biomedical image-prediction applications. However, the limited quantity of data presents a significant challenge for image segmentation. In addition, low image quality reduces segmentation efficiency, and previous deep learning models for image segmentation require large numbers of parameters and hundreds of millions of computations, resulting in high costs and long processing times. In this study, we introduce a new lightweight segmentation model, the mobile anti-aliasing attention U-net model (MAAU), which consists of an encoder path and a decoder path. The encoder combines an anti-aliasing layer with convolutional blocks to reduce the spatial resolution of input images while mitigating the loss of shift equivariance caused by downsampling. The decoder uses an attention block and a decoder module to capture the most prominent features in each channel. To address the data-related problems, we applied data augmentation methods such as flipping, rotation, shearing, translation, and color distortion, which improved segmentation efficiency on the International Skin Imaging Collaboration (ISIC) 2018 and PH2 datasets. Our experimental results demonstrate that our approach uses only 4.2 million parameters while outperforming various state-of-the-art segmentation methods.
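Note: The record does not include the model definition. The sketch below is a minimal, illustrative PyTorch example of the two ingredients named in the abstract: an anti-aliased (blur-pool) downsampling step for the encoder and an attention gate applied to skip connections, in the style of Attention U-Net. All class names, the 3x3 binomial kernel, and the channel sizes are assumptions made for illustration, not the authors' published MAAU implementation.

# Illustrative sketch only; layer names, kernel choice, and channel sizes are assumptions,
# not the published MAAU design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BlurPool2d(nn.Module):
    """Low-pass filter followed by stride-2 subsampling (anti-aliased downsampling)."""

    def __init__(self, channels: int, stride: int = 2):
        super().__init__()
        self.stride = stride
        self.channels = channels
        # 3x3 binomial kernel, applied depthwise to every channel.
        k = torch.tensor([1.0, 2.0, 1.0])
        k2 = (k[:, None] * k[None, :])
        k2 = k2 / k2.sum()
        self.register_buffer("kernel", k2.expand(channels, 1, 3, 3).clone())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = F.pad(x, (1, 1, 1, 1), mode="reflect")
        return F.conv2d(x, self.kernel, stride=self.stride, groups=self.channels)


class EncoderBlock(nn.Module):
    """Convolutional block followed by anti-aliased downsampling (encoder path)."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )
        self.down = BlurPool2d(out_ch)

    def forward(self, x):
        skip = self.conv(x)           # full-resolution features for the skip connection
        return self.down(skip), skip


class AttentionGate(nn.Module):
    """Gates encoder skip features with a decoder (gating) signal."""

    def __init__(self, skip_ch: int, gate_ch: int, inter_ch: int):
        super().__init__()
        self.theta = nn.Conv2d(skip_ch, inter_ch, 1)
        self.phi = nn.Conv2d(gate_ch, inter_ch, 1)
        self.psi = nn.Conv2d(inter_ch, 1, 1)

    def forward(self, skip, gate):
        # Resize the gating signal to the skip's spatial size, then compute attention weights.
        gate = F.interpolate(gate, size=skip.shape[2:], mode="bilinear", align_corners=False)
        attn = torch.sigmoid(self.psi(F.relu(self.theta(skip) + self.phi(gate))))
        return skip * attn            # suppress irrelevant regions in the skip path


if __name__ == "__main__":
    x = torch.randn(1, 3, 128, 128)           # dummy dermoscopy image
    enc = EncoderBlock(3, 32)
    down, skip = enc(x)                        # down: 64x64, skip: 128x128
    gate = torch.randn(1, 64, 64, 64)          # stand-in for decoder features
    att = AttentionGate(32, 64, 16)
    print(att(skip, gate).shape)               # torch.Size([1, 32, 128, 128])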
Database: Directory of Open Access Journals