Author:
You He, Hanchao Zhang, Xiaogang Ning, Ruiqian Zhang, Dong Chang, Minghui Hao
Language:
English
Publication year:
2023
Subject:

Source:
Remote Sensing, Vol 15, Iss 16, p 4095 (2023)
Document type:
article
ISSN:
2072-4292
DOI:
10.3390/rs15164095
Description:
Semantic change detection (SCD) is a challenging remote sensing task that aims to locate and identify changes between bi-temporal images, providing detailed “from-to” change information that is valuable for many remote sensing applications. Recent studies have shown that multi-task networks with dual segmentation branches and a single change branch are effective for SCD. However, these networks focus primarily on extracting contextual information and ignore spatial details, which leads to missed or false detections of small targets and inaccurate boundaries. To address these limitations, this paper proposes a spatial-temporal semantic perception network (STSP-Net) for SCD. The network exploits spatial detail information through a detail-aware path (DAP) and generates spatial-temporal semantic-perception features by combining it with deep contextual features. It further enhances the representation of semantic features in the spatial and temporal dimensions through a spatial attention fusion module (SAFM) and a temporal refinement detection module (TRDM), improving sensitivity to details and adaptively balancing performance between semantic segmentation (SS) and change detection (CD). In addition, an invariant consistency loss function (ICLoss) constrains the consistency of land cover (LC) categories in invariant regions, thereby improving the accuracy and robustness of SCD. Comparative experiments on three SCD datasets demonstrate the superiority of the proposed method: it outperforms other methods on various evaluation metrics, with Sek improvements of 2.84%, 1.63%, and 0.78% on the three datasets, respectively.
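For illustration only, the following is a minimal PyTorch-style sketch of an invariant-consistency-style loss based solely on the abstract's description (constraining land cover predictions to agree in unchanged regions). The exact formulation of ICLoss in STSP-Net is not given in this record; the symmetric KL term, tensor shapes, and function name below are assumptions.

import torch
import torch.nn.functional as F

def invariant_consistency_loss(logits_t1, logits_t2, change_mask, eps=1e-6):
    # logits_t1, logits_t2: (B, C, H, W) segmentation logits for the two dates.
    # change_mask: (B, 1, H, W) binary map, 1 = changed, 0 = unchanged.
    p1 = F.softmax(logits_t1, dim=1)
    p2 = F.softmax(logits_t2, dim=1)
    # Symmetric KL divergence between the per-pixel class distributions.
    kl = (p1 * (torch.log(p1 + eps) - torch.log(p2 + eps))
          + p2 * (torch.log(p2 + eps) - torch.log(p1 + eps))).sum(dim=1, keepdim=True)
    unchanged = 1.0 - change_mask.float()
    # Average the divergence only over pixels labeled as unchanged.
    return (kl * unchanged).sum() / (unchanged.sum() + eps)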
Database:
Directory of Open Access Journals |
External link: