SDRCNN: A Single-Scale Dense Residual Connected Convolutional Neural Network for Pansharpening

Author: Yuan Fang, Yuanzhi Cai, Lei Fan
Language: English
Year of publication: 2023
Subject:
Source: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol 16, Pp 6325-6338 (2023)
Document type: article
ISSN: 2151-1535
DOI: 10.1109/JSTARS.2023.3292320
Description: Pansharpening is the process of fusing a high spatial resolution panchromatic image with a low spatial resolution multispectral (MS) image to create a high-resolution MS image. A novel single-branch, single-scale lightweight convolutional neural network, named SDRCNN, is developed in this article. By using a novel dense residual connected structure and convolution block, SDRCNN achieved a better tradeoff between accuracy and efficiency. The performance of SDRCNN was tested using four datasets from the WorldView-3, WorldView-2, and QuickBird satellites. The compared methods include eight traditional methods (i.e., Gram–Schmidt (GS), Gram–Schmidt adaptive, partial replacement adaptive component substitution, band-dependent spatial detail, smoothing-filter-based intensity modulation, GLP-CBD, CDIF, and LRTCFPan) and five lightweight deep-learning methods (i.e., the pansharpening neural network, PanNet, BayesianNet, DMDNet, and FusionNet). Based on a visual inspection of the pansharpened images and the associated absolute residual maps, SDRCNN exhibited the least spatial-detail blurring and spectral distortion among all the methods considered. The values of the quantitative evaluation metrics were closest to their ideal values when SDRCNN was used, and its processing time was the shortest among all methods tested. Finally, the effectiveness of each component of SDRCNN was demonstrated in ablation experiments. All of these results confirm the superiority of SDRCNN.
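Note: the description mentions a dense residual connected structure and convolution block but gives no architectural details. The following PyTorch sketch is only a generic illustration of dense connectivity combined with a residual skip; the class name DenseResidualBlock, the channel counts, and the layer depth are assumptions for illustration and are not taken from the paper.

import torch
import torch.nn as nn

class DenseResidualBlock(nn.Module):
    # Illustrative dense residual connected convolution block (not the SDRCNN implementation).
    # Each 3x3 convolution receives the concatenation of the block input and all
    # preceding layer outputs (dense connectivity); the fused output is added back
    # to the block input (residual connection).
    def __init__(self, channels: int = 32, growth: int = 32, num_layers: int = 3):
        super().__init__()
        self.layers = nn.ModuleList()
        in_channels = channels
        for _ in range(num_layers):
            self.layers.append(
                nn.Sequential(
                    nn.Conv2d(in_channels, growth, kernel_size=3, padding=1),
                    nn.ReLU(inplace=True),
                )
            )
            in_channels += growth
        # 1x1 convolution fuses all concatenated features back to `channels`
        self.fuse = nn.Conv2d(in_channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return x + self.fuse(torch.cat(features, dim=1))  # residual skip

if __name__ == "__main__":
    # Toy usage: a feature map such as one derived from concatenated PAN and upsampled MS bands
    block = DenseResidualBlock(channels=32)
    feat = torch.randn(1, 32, 64, 64)
    print(block(feat).shape)  # torch.Size([1, 32, 64, 64])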
Database: Directory of Open Access Journals