FMAM-Net: Fusion Multi-Scale Attention Mechanism Network for Building Segmentation in Remote Sensing Images

Authors: Huanran Ye, Run Zhou, Jianhao Wang, Zhiliang Huang
Language: English
Year of publication: 2022
Subject:
Source: IEEE Access, Vol. 10, pp. 134241-134251 (2022)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3231362
Description: As one of the largest targets in remote sensing images, buildings have important application value in urban planning and old-city reconstruction. However, most networks have poor recognition ability on high-resolution images, resulting in blurred boundaries in the segmented building maps. Moreover, the similarity between buildings and their backgrounds leads to inter-class indistinction. Finally, the diversity of building appearances makes segmentation difficult and demands strong generalization ability from the network. To address these problems, we propose the Fusion Multi-scale Attention Mechanism Network (FMAM-Net). Firstly, we design the Feature Refine Compensation Module (FRCM) to alleviate boundary ambiguity; it comprises a Feature Refinement Module (FRM) and a Feature Compensation Module (FCM). FRM uses a densely connected architecture to refine features and increase recognition capability. FCM introduces low-level features to compensate for the lack of boundary information in high-level features. Secondly, to handle inter-class indistinction, we design a Tandem Attention Module (TAM) and a Parallel Attention Module (PAM). TAM sequentially filters features along the channel and spatial dimensions for adaptive feature refinement. PAM combines contextual information and uses high-level features to guide low-level features in selecting more distinguishable features. Finally, we augment the binary cross-entropy loss function with an evaluation index to reduce the error caused by determining the optimization direction through cross entropy alone. On the Inria Aerial Image Labeling Dataset, FMAM-Net achieves a mean IoU of 85.34%, which is 5.58% higher than AMUNet and 3.77% higher than our baseline (U-Net with a ResNet-34 encoder). On the WHU Dataset, FMAM-Net reaches the highest IoU of 91.06%, 1.67% higher than SARB-UNet and 0.2% higher than MAP-Net. The visualization results show that FMAM-Net sharpens the fuzzy boundaries of building segmentation and reduces inter-class indistinction.
Database: Directory of Open Access Journals
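The abstract's final design point, augmenting binary cross entropy with an evaluation index so the optimization direction is not determined by cross entropy alone, can be sketched as below. The abstract does not specify which index FMAM-Net adds, so the soft-IoU term, the function name, and the equal weighting here are all assumptions, not the paper's exact formulation:

```python
import numpy as np

def bce_soft_iou_loss(pred, target, eps=1e-7):
    """Hypothetical sketch: binary cross entropy plus a differentiable
    (soft) IoU penalty. `pred` holds probabilities in [0, 1], `target`
    holds binary ground-truth labels; both are same-shaped arrays."""
    pred = np.clip(pred, eps, 1.0 - eps)  # avoid log(0)
    # standard binary cross entropy, averaged over all pixels
    bce = -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))
    # soft IoU: replace the hard intersection/union counts with sums of
    # predicted probabilities, which keeps the term differentiable
    inter = np.sum(pred * target)
    union = np.sum(pred) + np.sum(target) - inter
    soft_iou = inter / (union + eps)
    # penalize low overlap; equal weighting of the two terms is assumed
    return bce + (1.0 - soft_iou)
```

A perfect prediction drives both terms toward zero, while a prediction that is confidently wrong is penalized by both the per-pixel cross entropy and the region-level overlap term.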