Flare Removal Model Based on Sparse-UFormer Networks

Author: Siqi Wu, Fei Liu, Yu Bai, Houzeng Han, Jian Wang, Ning Zhang
Language: English
Publication year: 2024
Subject:
Source: Entropy, Vol 26, Iss 8, p 627 (2024)
Document type: article
ISSN: 1099-4300
DOI: 10.3390/e26080627
Description: When a camera lens directly faces a strong light source, image flare commonly occurs, significantly reducing the clarity and texture of the photo and interfering with image-processing tasks that rely on visual sensors, such as image segmentation and feature extraction. A novel flare removal network, the Sparse-UFormer neural network, has been developed. The network integrates two core components into the UFormer architecture: the mixed-scale feed-forward network (MSFN) and top-k sparse attention (TKSA), which together form the sparse-transformer module. The MSFN module captures rich multi-scale information, addressing flare interference in images more effectively. The TKSA module, designed with a sparsity strategy, focuses on key features within the image, thereby significantly enhancing the precision and efficiency of flare removal. Furthermore, in the design of the loss function, a structural similarity index (SSIM) loss has been incorporated alongside the conventional flare, background, and reconstruction losses to preserve image details and structure while the flare is removed; minimising the loss of image information is a fundamental premise for effective image restoration. The proposed method achieves state-of-the-art performance on the Flare7K++ test dataset and in challenging real-world scenarios, demonstrating its effectiveness in removing flare artefacts from images.
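For intuition, the sketch below illustrates the top-k sparsity idea behind a TKSA-style attention block: attention scores are computed as usual, but only the k largest scores per query are kept before the softmax, so each output token attends to a sparse subset of keys. This is a minimal PyTorch illustration of the general technique, not the authors' implementation; the class name, head count, keep ratio, and tensor layout are assumptions made for the example.

```python
import torch
import torch.nn as nn


class TopKSparseAttention(nn.Module):
    """Illustrative top-k sparse self-attention over flattened image tokens."""

    def __init__(self, dim: int, num_heads: int = 4, keep_ratio: float = 0.5):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.keep_ratio = keep_ratio  # fraction of key positions kept per query (assumed)
        self.qkv = nn.Linear(dim, dim * 3, bias=False)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim) -- e.g. flattened patches of a feature map
        b, n, d = x.shape
        qkv = self.qkv(x).reshape(b, n, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)  # each: (b, heads, n, head_dim)

        scores = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5  # (b, heads, n, n)

        # Keep only the top-k scores per query; mask the rest to -inf so the
        # softmax assigns them zero weight (the sparsity strategy).
        k_keep = max(1, int(self.keep_ratio * n))
        topk_vals, _ = scores.topk(k_keep, dim=-1)
        threshold = topk_vals[..., -1:]  # k-th largest score for each query
        sparse_scores = scores.masked_fill(scores < threshold, float("-inf"))

        attn = sparse_scores.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, d)
        return self.proj(out)


if __name__ == "__main__":
    # Example: a batch of two 8x8 token windows with 32 channels each.
    x = torch.randn(2, 64, 32)
    y = TopKSparseAttention(dim=32)(x)
    print(y.shape)  # torch.Size([2, 64, 32])
```

In this sketch, sparsity is applied per query by thresholding at the k-th largest score, which keeps the computation dense but zeroes out low-relevance attention weights; the paper's module may realise the same idea differently within the sparse-transformer block.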
Database: Directory of Open Access Journals