Description: |
With the increasingly rapid development of convolutional neural networks, the field of remote sensing has experienced a significant revitalization. However, understanding and detecting surface changes in high-resolution remote sensing images remains a substantial challenge for precise change detection. Existing deep learning-based change detection techniques often lack the precision needed to capture edge details and other fine-grained information in remote sensing images. To address these limitations, we propose a semantic segmentation deep learning network, the self-adaptive Siamese network (SASiamNet), designed to improve change detection in remote sensing images. SASiamNet performs real-time land cover segmentation, extracting local and global information from images via a residual backbone network. It further incorporates a primary feature fusion module to extract and fuse primary-stage feature maps, and a high-level information refinement module to refine the resulting feature map. This design transforms low-level semantic information into high-level semantic information, improving the overall detection process. To empirically evaluate the effectiveness of SASiamNet, we use two distinct datasets: the public LEVIR-CD dataset and the more challenging CDD dataset, the latter composed of bitemporal images sourced from Google Earth and spanning various regions across China. The experimental results demonstrate that our approach outperforms both traditional methods and contemporary state-of-the-art change detection techniques, underscoring the efficacy of SASiamNet for remote sensing image change detection.
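
The abstract names a shared residual backbone, a primary feature fusion module, and a high-level information refinement module, but gives no implementation details. The following PyTorch sketch is only an illustration of the general Siamese change-detection pattern those components suggest; the class names, layer sizes, and concatenation-based fusion are assumptions and do not reproduce the authors' SASiamNet.

    # Minimal sketch of a Siamese change-detection network, under the
    # assumptions stated above (not the authors' implementation).
    import torch
    import torch.nn as nn
    from torchvision.models import resnet18


    class FeatureFusion(nn.Module):
        """Hypothetical fusion of bitemporal feature maps via concatenation."""
        def __init__(self, channels):
            super().__init__()
            self.fuse = nn.Sequential(
                nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )

        def forward(self, f1, f2):
            return self.fuse(torch.cat([f1, f2], dim=1))


    class SiameseChangeDetector(nn.Module):
        """Weight-shared backbone applied to both dates, then fuse and refine."""
        def __init__(self):
            super().__init__()
            backbone = resnet18(weights=None)
            # Shared encoder: all residual stages, dropping avgpool and fc.
            self.encoder = nn.Sequential(*list(backbone.children())[:-2])  # -> 512 ch
            self.fusion = FeatureFusion(512)
            # Refinement head predicting a per-pixel change map.
            self.refine = nn.Sequential(
                nn.Conv2d(512, 64, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(64, 1, kernel_size=1),
            )

        def forward(self, img_t1, img_t2):
            f1 = self.encoder(img_t1)      # features of the earlier image
            f2 = self.encoder(img_t2)      # same weights applied to the later image
            fused = self.fusion(f1, f2)    # combine bitemporal features
            logits = self.refine(fused)    # coarse change logits
            # Upsample to the input resolution for a pixel-wise change map.
            return nn.functional.interpolate(
                logits, size=img_t1.shape[-2:], mode="bilinear", align_corners=False
            )


    if __name__ == "__main__":
        model = SiameseChangeDetector()
        t1 = torch.randn(1, 3, 256, 256)   # bitemporal RGB patches
        t2 = torch.randn(1, 3, 256, 256)
        print(model(t1, t2).shape)         # torch.Size([1, 1, 256, 256])

In this pattern the two image dates pass through the same encoder weights, which is what makes the network "Siamese"; how the fused features are refined into edge-aware change maps is precisely where the paper's proposed modules would differ from this plain sketch.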