MFAFNet: A Multiscale Fully Attention Fusion Network for Remote Sensing Image Semantic Segmentation

Authors: Yuanyuan Dang, Yu Gao, Bing Liu
Language: English
Publication Year: 2024
Subject:
Source: IEEE Access, Vol 12, Pp 123388-123400 (2024)
Document Type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3451153
Description: Semantic segmentation of high-resolution remote sensing images is widely used in fields such as precision agriculture, urban planning, and environmental monitoring. Convolutional neural networks (CNNs) perform well on this task: they excel at extracting local feature details but lack the ability to model global context. To obtain rich local-global context information, this work describes an encoder-decoder semantic segmentation network for remote sensing images, named the Multiscale Fully Attention Fusion Network (MFAFNet). To improve segmentation efficiency, the encoder uses ResNet18 as its feature extractor, after which an explicit visual center module (EVC) and a fully attention network block (FANB) are designed to retrieve detailed global context information. Finally, in the decoder stage, a gated channel attention fusion module (GCF) augments channel interaction information while fusing low-level features for efficient aggregation. Experiments were conducted on the publicly available Vaihingen and Potsdam datasets from the International Society for Photogrammetry and Remote Sensing (ISPRS), as well as the LoveDA dataset, and show that MFAFNet outperforms other popular methods. Ablation experiments on the Vaihingen dataset further validate the effectiveness of the network components.
Database: Directory of Open Access Journals
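
The abstract's description of the gated channel attention fusion (GCF) module, which reweights channels via a gate while fusing low-level features with high-level decoder features, could be sketched roughly as below. This is a minimal illustration, not the authors' implementation: the class name, layer choices, reduction ratio, and fusion rule are all assumptions based only on the abstract's wording.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedChannelFusion(nn.Module):
    """Hypothetical GCF-like block: a squeeze-excitation-style channel gate
    modulates the fusion of low-level and high-level features."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        # Channel gate: global pooling -> bottleneck MLP -> per-channel weight in [0, 1]
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.fuse = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, low: torch.Tensor, high: torch.Tensor) -> torch.Tensor:
        # Upsample high-level features to the low-level spatial resolution
        high = F.interpolate(high, size=low.shape[2:], mode="bilinear",
                             align_corners=False)
        g = self.gate(high)            # (N, C, 1, 1) channel-interaction weights
        return self.fuse(low * g + high)  # gated aggregation of the two streams


# Toy forward pass: low-level features at 32x32, high-level at 16x16
low = torch.randn(1, 64, 32, 32)
high = torch.randn(1, 64, 16, 16)
out = GatedChannelFusion(64)(low, high)
print(out.shape)  # torch.Size([1, 64, 32, 32])
```

The gate here follows the common squeeze-and-excitation pattern; the paper's actual GCF may compute channel attention differently, but the sketch shows where such a gate sits in an encoder-decoder fusion stage.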