PSANet: Automatic colourisation using position‐spatial attention for natural images

Author: Peng‐Jie Zhu, Yuan‐Yuan Pu, Qiuxia Yang, Siqi Li, Zheng‐Peng Zhao, Hao Wu, Dan Xu
Language: English
Year of publication: 2024
Subject:
Source: IET Computer Vision, Vol 18, Iss 7, Pp 922-934 (2024)
Document type: article
ISSN: 1751-9640; 1751-9632
DOI: 10.1049/cvi2.12291
Description: Abstract Due to the richness of natural image semantics, natural image colourisation is a challenging problem. Existing methods often suffer from semantic confusion due to insufficient semantic understanding, resulting in unreasonable colour assignments, especially at the edges of objects; this phenomenon is referred to as colour bleeding. The authors have found that using the self‐attention mechanism benefits the model's understanding and recognition of object semantics. However, it introduces another problem in colourisation, namely dull colour. With this in mind, a Position‐Spatial Attention Network (PSANet) is proposed to address both colour bleeding and dull colour. Firstly, a novel attention module, the position‐spatial attention module (PSAM), is introduced. Through the proposed PSAM, the model enhances its semantic understanding of images while solving the dull colour problem caused by self‐attention. Then, to further prevent colour bleeding at object boundaries, a gradient‐aware loss is proposed. Lastly, the colour bleeding phenomenon is further reduced by the combined effect of the gradient‐aware loss and an edge‐aware loss. Experimental results show that this method can largely reduce colour bleeding while maintaining good perceptual quality.
Database: Directory of Open Access Journals
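
The abstract above refers to a gradient‐aware loss that discourages colour bleeding at object boundaries, but the record does not give its formulation. The sketch below is a minimal, hypothetical illustration of one common way such a term can be written, assuming a Sobel‐based comparison of spatial gradients between predicted and ground‐truth colour channels (e.g. the ab channels of a Lab image); the function name `gradient_aware_loss` and every detail here are assumptions for illustration, not the paper's actual definition.

```python
import torch
import torch.nn.functional as F

def gradient_aware_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Hypothetical gradient-aware loss (illustrative, not from the paper):
    penalises mismatches between the spatial gradients of predicted and
    ground-truth colour channels, encouraging sharp colour transitions at
    object boundaries.

    pred, target: (B, C, H, W) tensors, e.g. predicted / true ab channels.
    """
    c = pred.shape[1]
    # Sobel kernels for horizontal and vertical gradients
    kx = torch.tensor([[-1., 0., 1.],
                       [-2., 0., 2.],
                       [-1., 0., 1.]], device=pred.device, dtype=pred.dtype)
    ky = kx.t()
    # One depthwise kernel per channel
    kx = kx.view(1, 1, 3, 3).repeat(c, 1, 1, 1)
    ky = ky.view(1, 1, 3, 3).repeat(c, 1, 1, 1)

    def grads(x):
        gx = F.conv2d(x, kx, padding=1, groups=c)
        gy = F.conv2d(x, ky, padding=1, groups=c)
        return gx, gy

    pgx, pgy = grads(pred)
    tgx, tgy = grads(target)
    # L1 distance between the gradient maps of prediction and ground truth
    return (pgx - tgx).abs().mean() + (pgy - tgy).abs().mean()
```

In a training loop, a term of this kind would typically be added to the main reconstruction loss with a small weight, e.g. `loss = l1_loss + 0.1 * gradient_aware_loss(pred_ab, gt_ab)`; the weight and the combination with the edge‐aware loss mentioned in the abstract are likewise assumptions here.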