VIF-Net: An Unsupervised Framework for Infrared and Visible Image Fusion

Authors: Ruichao Hou, Dongming Zhou, Yanbu Guo, Rencan Nie, Dong Liu, Chuanbo Yu, Lei Xiong
Year: 2020
Subject:
Source: IEEE Transactions on Computational Imaging, 6:640-651
ISSN: 2334-0118
2573-0436
DOI: 10.1109/tci.2020.2965304
Description: Visible images provide abundant texture details and environmental information, while infrared images offer night-time visibility and suppress interference from highly dynamic regions; fusing these complementary features from different sensors into a single informative image is therefore a meaningful task. In this article, we propose an unsupervised end-to-end learning framework for infrared and visible image fusion. We first construct a sufficiently large benchmark training set from visible and infrared frames, which addresses the shortage of training data. Because labeled data are unavailable, the architecture is trained with a robust mixed loss function that combines a modified structural similarity (M-SSIM) metric with total variation (TV), yielding an unsupervised learning process that adaptively fuses thermal radiation and texture details while suppressing noise interference. Moreover, the method is end-to-end, which avoids hand-crafted fusion rules and reduces computational cost. Extensive experimental results demonstrate that the proposed architecture outperforms state-of-the-art methods in both subjective and objective evaluations.
Database: OpenAIRE
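
The description above mentions a mixed loss built from an M-SSIM term and a total variation term. The following is a minimal PyTorch sketch of such a loss, not the authors' exact formulation: it uses a plain window-averaged SSIM rather than the paper's modified M-SSIM, penalizes dissimilarity against both source images, and introduces an illustrative weighting factor lambda_tv and an 11x11 window, all of which are assumptions made here for clarity.

import torch
import torch.nn.functional as F

def ssim(x, y, window_size=11, c1=0.01**2, c2=0.03**2):
    """Mean SSIM between two single-channel image batches in [0, 1].
    Uses uniform (average-pooling) windows; window size is an assumption."""
    pad = window_size // 2
    mu_x = F.avg_pool2d(x, window_size, stride=1, padding=pad)
    mu_y = F.avg_pool2d(y, window_size, stride=1, padding=pad)
    sigma_x = F.avg_pool2d(x * x, window_size, stride=1, padding=pad) - mu_x**2
    sigma_y = F.avg_pool2d(y * y, window_size, stride=1, padding=pad) - mu_y**2
    sigma_xy = F.avg_pool2d(x * y, window_size, stride=1, padding=pad) - mu_x * mu_y
    num = (2 * mu_x * mu_y + c1) * (2 * sigma_xy + c2)
    den = (mu_x**2 + mu_y**2 + c1) * (sigma_x + sigma_y + c2)
    return (num / den).mean()

def total_variation(x):
    """Anisotropic total variation of a (N, C, H, W) batch, averaged over pixels;
    discourages noise in the fused output."""
    tv_h = (x[:, :, 1:, :] - x[:, :, :-1, :]).abs().mean()
    tv_w = (x[:, :, :, 1:] - x[:, :, :, :-1]).abs().mean()
    return tv_h + tv_w

def mixed_fusion_loss(fused, infrared, visible, lambda_tv=1.0):
    """Structural-similarity term against both source images plus a TV term.
    lambda_tv is a hypothetical weighting factor, not a value from the paper."""
    ssim_term = (1 - ssim(fused, infrared)) + (1 - ssim(fused, visible))
    return ssim_term + lambda_tv * total_variation(fused)

# Example usage with a hypothetical fusion network `model`:
#   fused = model(infrared, visible)
#   loss = mixed_fusion_loss(fused, infrared, visible)
#   loss.backward()

Because both terms are computed directly from the unlabeled source images, training can proceed without ground-truth fused images, which is the essence of the unsupervised setup described in the abstract.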