Learning to Transfer Texture From Clothing Images to 3D Humans
Authors: Aymen Mir, Thiemo Alldieck, Gerard Pons-Moll
Year of publication: 2020
Subject: FOS: Computer and information sciences; Computer Science - Computer Vision and Pattern Recognition (cs.CV); image processing and computer vision; computer graphics; computer vision; artificial intelligence; image segmentation; solid modeling; texture; image warping; image-to-image translation
Source: CVPR
Description: In this paper, we present a simple yet effective method to automatically transfer textures of clothing images (front and back) to 3D garments worn on top of SMPL, in real time. We first automatically compute training pairs of images with aligned 3D garments using a custom non-rigid 3D-to-2D registration method, which is accurate but slow. Using these pairs, we learn a mapping from pixels to the 3D garment surface. Our idea is to learn dense correspondences from garment image silhouettes to a 2D UV map of a 3D garment surface using shape information alone, completely ignoring texture, which allows us to generalize to a wide range of web images. Several experiments demonstrate that our model is more accurate than widely used baselines such as thin-plate-spline warping and image-to-image translation networks, while being orders of magnitude faster. Our model opens the door for applications such as virtual try-on, and allows for the generation of 3D humans with varied textures, which is necessary for learning. Comment: IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
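The core idea described above, once per-pixel correspondences from the garment silhouette to the UV map are available, reduces to scattering image pixels into a UV texture. The sketch below illustrates only that final transfer step; the learned correspondence model itself is not shown, and the function and parameter names are hypothetical, not from the paper's code.

```python
import numpy as np

def transfer_texture(image, mask, uv_pred, tex_size=64):
    """Scatter clothing-image pixels into a UV texture map.

    image:   (H, W, 3) float array, the clothing photo
    mask:    (H, W) bool array, the garment silhouette
    uv_pred: (H, W, 2) float array in [0, 1], per-pixel UV coordinates
             as a correspondence model would predict them (assumed given)
    """
    tex = np.zeros((tex_size, tex_size, 3))
    ys, xs = np.nonzero(mask)                      # garment pixels only
    # Map continuous UV coordinates to integer texel indices.
    u = np.clip((uv_pred[ys, xs, 0] * (tex_size - 1)).round().astype(int),
                0, tex_size - 1)
    v = np.clip((uv_pred[ys, xs, 1] * (tex_size - 1)).round().astype(int),
                0, tex_size - 1)
    tex[v, u] = image[ys, xs]                      # write colors into UV space
    return tex
```

Because the correspondences are predicted from shape alone, the same transfer works for arbitrary textures; a renderer can then apply the resulting UV texture to the 3D garment mesh.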
Database: OpenAIRE
External link: