Uncertainty-Guided Segmentation Network for Geospatial Object Segmentation

Author: Hongyu Jia, Wenwu Yang, Lin Wang, Haolin Li
Language: English
Year of publication: 2024
Subject:
Source: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol 17, pp. 5824-5833 (2024)
Document type: article
ISSN: 2151-1535
DOI: 10.1109/JSTARS.2024.3361693
Description: Geospatial objects pose significant challenges, including dense distribution, substantial interclass variations, and minimal intraclass variations. These complexities make precise foreground object segmentation in high-resolution remote sensing images highly challenging. Current segmentation approaches often rely on the standard encoder–decoder architecture to extract object-related information, but overlook the inherent uncertainty issues that arise during the process. In this article, we aim to enhance segmentation by introducing an uncertainty-guided decoding mechanism and propose the uncertainty-guided segmentation network (UGSNet). Specifically, building upon the conventional encoder–decoder architecture, we initially employ the pyramid vision transformer to extract multilevel features containing extensive long-range information. We then introduce an uncertainty-guided decoding mechanism, addressing both epistemic and aleatoric uncertainties, to progressively refine segmentation with higher certainty at each level. With this uncertainty-guided decoding mechanism, our UGSNet achieves accurate geospatial object segmentation. To validate the effectiveness of UGSNet, we conduct extensive experiments on the large-scale iSAID dataset, and the results unequivocally demonstrate the superiority of our method over other state-of-the-art segmentation methods.
Database: Directory of Open Access Journals
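The description above does not specify how the uncertainty-guided decoding is implemented; a common proxy for per-pixel predictive uncertainty is the normalized entropy of the class probabilities, which can then gate decoder features so refinement concentrates on uncertain regions. The sketch below is a hypothetical NumPy illustration of that idea (the function names, the entropy-based uncertainty, and the multiplicative gating rule are assumptions, not the authors' method):

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def uncertainty_map(probs, eps=1e-8):
    # Predictive entropy normalized to [0, 1]: 1 where the model is
    # maximally unsure, 0 where one class dominates.
    num_classes = probs.shape[-1]
    ent = -(probs * np.log(probs + eps)).sum(axis=-1)
    return ent / np.log(num_classes)

def certainty_gated_refine(coarse_logits, skip_features):
    # Hypothetical gating rule: scale encoder skip features by per-pixel
    # uncertainty so the decoder's next stage focuses on ambiguous pixels.
    u = uncertainty_map(softmax(coarse_logits))   # (H, W)
    return skip_features * u[..., None]           # broadcast over channels

# Toy example: a 4x4 coarse prediction over 3 classes, 8-channel skip features.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 4, 3))
feats = rng.normal(size=(4, 4, 8))
refined = certainty_gated_refine(logits, feats)
```

In a multilevel decoder like the one the abstract describes, this gating would be applied at each scale, so later stages spend capacity where the earlier, coarser prediction was least certain.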