Recurrent RLCN-Guided Attention Network for Single Image Deraining

Authors: Masatoshi Okutomi, Yusuke Monno, Yizhou Li
Year of publication: 2021
Source: MVA
DOI: 10.23919/mva51890.2021.9511405
Description: Single image deraining is an important yet challenging task due to the ill-posed nature of deriving a rain-free clean image from a rainy image. In this paper, we propose the Recurrent RLCN-Guided Attention Network (RRANet) for single image deraining. Our main technical contributions are threefold: (i) We propose rectified local contrast normalization (RLCN), applied to the input rainy image to effectively mark candidate rain regions. (ii) We propose the RLCN-guided attention module (RLCN-GAM) to learn an effective attention map for deraining without the need for ground-truth rain masks. (iii) We incorporate RLCN-GAM into a recurrent neural network to progressively derive the rainy-to-clean image mapping. Quantitative and qualitative evaluations on representative deraining benchmark datasets demonstrate that our proposed RRANet outperforms existing state-of-the-art deraining methods; it is particularly noteworthy that our method clearly achieves the best performance on a real-world dataset.
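The abstract does not give the exact RLCN formula, but a plausible reading of "rectified local contrast normalization" is standard local contrast normalization (subtract the local mean, divide by the local standard deviation) followed by rectification that keeps only positive responses, since rain streaks are typically brighter than their surroundings. The sketch below illustrates that interpretation; the window size and epsilon are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def rlcn(image, window=9, eps=1e-6):
    """Sketch of rectified local contrast normalization (RLCN).

    Assumed formulation (not taken from the paper): normalize each
    pixel by its local mean and standard deviation over a square
    window, then rectify so that only brighter-than-surroundings
    responses (candidate rain pixels) survive.
    """
    image = image.astype(np.float64)
    local_mean = uniform_filter(image, size=window)
    local_sq_mean = uniform_filter(image ** 2, size=window)
    # Variance via E[x^2] - E[x]^2, clamped to avoid negative values
    # caused by floating-point rounding.
    local_std = np.sqrt(np.maximum(local_sq_mean - local_mean ** 2, 0.0))
    lcn = (image - local_mean) / (local_std + eps)
    return np.maximum(lcn, 0.0)  # rectification: keep positive responses
```

In the paper's pipeline, a map like this would serve as guidance for the attention module (RLCN-GAM) rather than as a final rain mask.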
Database: OpenAIRE