Unpaired Image-to-Sketch Translation Network for Sketch Synthesis
Authors: Jie Yang, Yue Zhang, Guoyao Su, Yonggang Qi
Year of publication: 2019
Subject: sketch recognition; computer science; image-to-sketch translation; consistency; artificial intelligence; representation; image retrieval; natural language processing
Source: VCIP
DOI: 10.1109/vcip47243.2019.8965725
Description: Image-to-sketch translation learns the mapping between an image and a corresponding human-drawn sketch. A machine can be trained to mimic the human drawing process using a training set of aligned image-sketch pairs. However, collecting such paired data is expensive, or even infeasible in many cases, since sketches exhibit varying levels of abstraction and drawing preferences. We therefore present an approach for learning an image-to-sketch translation network from unpaired examples. A translation network, which maps representations from the image latent space to the sketch domain, is trained in an unsupervised setting. To prevent representation shifting in cross-domain translation, a novel cycle+ consistency loss is explored. Experimental results on sketch recognition and sketch-based image retrieval demonstrate the effectiveness of our approach.
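The record does not spell out the "cycle+" consistency loss, but the underlying idea of cycle consistency for unpaired translation (as popularized by CycleGAN) can be sketched in a few lines. Below is a minimal, hypothetical illustration: `G` and `F` are toy linear stand-ins for the image-to-sketch and sketch-to-image translators, and the loss penalizes the L1 error after a full round trip in each direction. All names and the linear setup are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "translators": invertible linear maps standing in for the
# image->sketch generator G and the sketch->image generator F.
W_g = np.eye(8) + 0.1 * rng.normal(size=(8, 8))
W_f = np.linalg.inv(W_g)  # F is the exact inverse, so cycle loss ~ 0

def G(x):
    """Map image-latent vectors to the sketch domain (toy linear map)."""
    return x @ W_g

def F(y):
    """Map sketch-domain vectors back to the image latent space."""
    return y @ W_f

def cycle_consistency_loss(x, y):
    """Mean L1 reconstruction error after a round trip in each direction."""
    forward = np.abs(F(G(x)) - x).mean()   # image -> sketch -> image
    backward = np.abs(G(F(y)) - y).mean()  # sketch -> image -> sketch
    return forward + backward

x = rng.normal(size=(4, 8))  # batch of image latents
y = rng.normal(size=(4, 8))  # batch of sketch representations
loss = cycle_consistency_loss(x, y)
print(loss)  # near zero, since F inverts G exactly
```

In an actual unpaired training setup, `G` and `F` would be neural networks and this loss would be minimized jointly with adversarial losses; the paper's "cycle+" variant presumably adds a further constraint against representation shift, whose exact form this record does not give.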
Database: OpenAIRE