Decoupling and Coupling Transformation for Arbitrary Style Transfer

Author: Zhiwei Yan, Yinqi Chen, Yangting Zheng
Year of publication: 2021
Subject:
Source: 2021 6th International Conference on Image, Vision and Computing (ICIVC).
DOI: 10.1109/icivc52351.2021.9526936
Description: Style transfer is the task of extracting the style from a style image to synthesize an output that preserves the content of a content image. The most effective recent methods transfer the style image onto the content image via a transformation matrix. These algorithms rely on a variety of assumptions to model style, yet there is no precise account of how style and content are combined. They model the feature covariance as style and produce copy-and-paste results in the synthesized images, which we call the texture mapping effect. In this work, we define the coupling form as the multiplication of a style matrix and a content matrix. We then derive the form of the transformation theoretically and present an arbitrary style transfer approach that decouples content and style from features based on this definition. Under this assumption, the stylized image does not introduce a texture mapping effect. We demonstrate the effectiveness of our approach through comparisons with state-of-the-art methods in both style transfer and photorealistic image stylization. The results show that our method exhibits no texture mapping effect and is more in line with the definition of stylization.
Database: OpenAIRE
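
Note: the abstract refers to the common baseline in which feature covariance is modeled as style and applied through a linear transformation matrix. The sketch below is a minimal NumPy illustration of that covariance-based (whitening/coloring-style) transformation, not of the paper's decoupling and coupling formulation; the function name, shapes, and epsilon value are illustrative assumptions.

```python
import numpy as np

def covariance_transform(content_feat, style_feat, eps=1e-5):
    """Whiten content features, then re-color them with the style covariance.

    content_feat, style_feat: arrays of shape (C, H*W), i.e. encoder feature
    maps flattened over spatial positions. This models feature covariance as
    style, the assumption the abstract critiques as causing a texture
    mapping effect.
    """
    # Center both feature sets over spatial positions.
    c_mean = content_feat.mean(axis=1, keepdims=True)
    s_mean = style_feat.mean(axis=1, keepdims=True)
    fc = content_feat - c_mean
    fs = style_feat - s_mean

    # Whitening: remove the content covariance.
    cov_c = fc @ fc.T / (fc.shape[1] - 1) + eps * np.eye(fc.shape[0])
    vals_c, vecs_c = np.linalg.eigh(cov_c)
    whiten = vecs_c @ np.diag(vals_c ** -0.5) @ vecs_c.T

    # Coloring: impose the style covariance.
    cov_s = fs @ fs.T / (fs.shape[1] - 1) + eps * np.eye(fs.shape[0])
    vals_s, vecs_s = np.linalg.eigh(cov_s)
    color = vecs_s @ np.diag(vals_s ** 0.5) @ vecs_s.T

    # The combined matrix (color @ whiten) is the "transformation matrix"
    # applied to content features, shifted to the style mean.
    return color @ whiten @ fc + s_mean
```

In contrast, the paper's approach replaces this covariance assumption with a coupling form defined as the product of a separately decoupled style matrix and content matrix; the details of that derivation are given in the full text.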