FAMOUS: High-Fidelity Monocular 3D Human Digitization Using View Synthesis

Author: Hema, Vishnu Mani, Aich, Shubhra, Haene, Christian, Bazin, Jean-Charles, de la Torre, Fernando
Year of publication: 2024
Subject:
Document type: Working Paper
DOI: 10.1007/978-3-031-73007-8_4
Description: Advances in deep implicit modeling and articulated models have significantly enhanced the digitization of 3D human figures from a single image. While state-of-the-art methods have greatly improved geometric precision, accurately inferring texture remains challenging, particularly in occluded areas such as the back of a person in frontal-view images. This limitation in texture prediction largely stems from the scarcity of large-scale, diverse 3D datasets, whereas their 2D counterparts are abundant and easily accessible. To address this issue, our paper proposes leveraging extensive 2D fashion datasets to enhance both texture and shape prediction in 3D human digitization. We incorporate 2D priors from the fashion dataset to learn the occluded back view, refined with our proposed domain alignment strategy. We then fuse this information with the input image to obtain a fully textured mesh of the given person. Through extensive experimentation on standard 3D human benchmarks, we demonstrate the superior performance of our approach in terms of both texture and geometry. Code and dataset are available at https://github.com/humansensinglab/FAMOUS.
Database: arXiv