Unpaired Image-to-Sketch Translation Network for Sketch Synthesis

Author: Jie Yang, Yue Zhang, Guoyao Su, Yonggang Qi
Year of publication: 2019
Subject:
Source: VCIP
DOI: 10.1109/vcip47243.2019.8965725
Description: Image-to-sketch translation aims to learn the mapping between an image and a corresponding human-drawn sketch. A machine can be trained to mimic the human drawing process using a training set of aligned image-sketch pairs. However, collecting such paired data is expensive, or the data are simply unavailable in many cases, since sketches exhibit varying levels of abstractness and drawing preferences. Hence, we present an approach for learning an image-to-sketch translation network from unpaired examples. A translation network, which maps representations from the image latent space to the sketch domain, is trained in an unsupervised setting. To prevent representation shifting in cross-domain translation, a novel cycle+ consistency loss is explored. Experimental results on sketch recognition and sketch-based image retrieval demonstrate the effectiveness of our approach.
Database: OpenAIRE
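
The abstract names a "cycle+ consistency loss" for unpaired cross-domain translation but does not define it. As context only, the sketch below shows the standard cycle-consistency term that such losses typically build on, in a minimal PyTorch form; the generator modules `G` (image to sketch) and `F` (sketch to image), the L1 reconstruction metric, and the weight value are assumptions, not the authors' specification.

```python
import torch
import torch.nn as nn

def cycle_consistency_loss(G: nn.Module, F: nn.Module,
                           real_image: torch.Tensor,
                           real_sketch: torch.Tensor,
                           weight: float = 10.0) -> torch.Tensor:
    """Generic cycle-consistency term for unpaired translation.

    G: hypothetical image-to-sketch generator.
    F: hypothetical sketch-to-image generator.
    This is a standard formulation, not the paper's "cycle+" variant,
    which the abstract does not detail.
    """
    l1 = nn.L1Loss()
    # Forward cycle: image -> sketch -> reconstructed image.
    rec_image = F(G(real_image))
    # Backward cycle: sketch -> image -> reconstructed sketch.
    rec_sketch = G(F(real_sketch))
    return weight * (l1(rec_image, real_image) + l1(rec_sketch, real_sketch))
```

In unpaired training, this term is usually added to adversarial losses for each domain so that translations remain invertible, discouraging the representation shift the abstract refers to.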