K-Means Clustering Guided Generative Adversarial Networks for SAR-Optical Image Matching

Authors: Xiaolin Tian, Wenliang Du, Yong Zhou, Jiaqi Zhao
Year of publication: 2020
Source: IEEE Access, Vol. 8, pp. 217554-217572 (2020)
ISSN: 2169-3536
DOI: 10.1109/access.2020.3042213
Description: Synthetic Aperture Radar and optical (SAR-optical) image matching is the task of finding correspondences between SAR and optical images. SAR-optical image matching can be reduced to single-mode image matching through image synthesis. However, existing SAR-optical image synthesis methods cannot provide images of sufficient quality for SAR-optical image matching. In this work, we present a K-means Clustering Guided Generative Adversarial Network (KCG-GAN) that improves the quality of the synthesized images by constraining the synthesis of spatial information. KCG-GAN uses k-means segmentations as one of the image generator's inputs and introduces a feature matching loss, a segmentation loss, and an L1 loss into the objective function. In addition, to provide repeatable k-means segmentations, we develop a straightforward 1D k-means algorithm. We compare KCG-GAN with a leading image synthesis method, pix2pixHD. Qualitative results illustrate that KCG-GAN preserves more spatial structure than pix2pixHD. Quantitative results show that, compared with pix2pixHD, images synthesized by KCG-GAN are more similar to the original optical images, and SAR-optical image matching based on KCG-GAN obtains up to 3.15 times more qualified matches. Robustness tests demonstrate that SAR-optical image matching based on KCG-GAN is robust to rotation and scale changes. We also test three SIFT-like algorithms on matching original SAR-optical image pairs and on matching KCG-GAN-synthesized optical-optical image pairs. Experimental results show that KCG-GAN significantly improves the performance of all three algorithms on SAR-optical image matching.
Database: OpenAIRE
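
The description mentions a "straightforward 1D k-means algorithm" used to produce repeatable segmentations that guide the generator. The record does not give the paper's exact procedure; the Python sketch below only illustrates one way to make 1D k-means deterministic (quantile-based initialisation instead of random seeding), and all function names and parameters are chosen for illustration, not taken from the paper.

import numpy as np

def kmeans_1d(values, k, n_iters=100):
    # Illustrative deterministic 1D k-means (not the paper's exact algorithm).
    # Centroids start at evenly spaced quantiles of the pixel intensities,
    # so repeated runs on the same image yield the same segmentation.
    x = np.asarray(values, dtype=np.float64).ravel()
    centroids = np.quantile(x, np.linspace(0.0, 1.0, k))
    for _ in range(n_iters):
        # Assign every value to its nearest centroid.
        labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
        new_centroids = centroids.copy()
        for j in range(k):
            members = x[labels == j]
            if members.size:                      # leave empty clusters unchanged
                new_centroids[j] = members.mean()
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Usage example: segment a grayscale patch into k intensity clusters.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patch = rng.random((64, 64))                  # stand-in for a SAR image patch
    labels, centroids = kmeans_1d(patch, k=4)
    segmentation = labels.reshape(patch.shape)    # per-pixel cluster map that could guide a generator
    print(segmentation.shape, centroids)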