Showing 1 - 10 of 349 results for the search: '"Otaduy, Miguel A."'
Published in:
Proceedings of 3DBODY.TECH 2023 - 14th International Conference and Exhibition on 3D Body Scanning and Processing Technologies, Lugano, Switzerland, October 2023
Computer models of humans are ubiquitous throughout computer animation and computer vision. However, these models rarely represent the dynamics of human motion, as this requires adding a complex layer that solves body motion in response to external i…
External link:
http://arxiv.org/abs/2310.18206
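The entry above motivates adding a dynamics layer on top of kinematic human body models. As a purely illustrative sketch (not this paper's method), one simple way to drive such a layer is to integrate per-vertex soft-tissue offsets toward the skinned pose with a damped elastic update; the function name, parameters, and constants below are assumptions.

import numpy as np

def soft_tissue_step(x, v, x_skinned, dt=1.0 / 30.0, stiffness=50.0, damping=5.0, mass=1.0):
    # x, v, x_skinned: (N, 3) arrays of dynamic positions, velocities and kinematic targets.
    # Elastic pull toward the skinned body plus velocity damping (illustrative constants).
    force = stiffness * (x_skinned - x) - damping * v
    v = v + dt * force / mass   # semi-implicit (symplectic) Euler: update velocity first...
    x = x + dt * v              # ...then position, which keeps the explicit step stable
    return x, v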
We propose a novel formulation of elastic materials based on high-order interpolants, which accurately fits complex elastic behaviors but remains conservative. The proposed high-order interpolants can be regarded as a high-dimensional extension of r…
External link:
http://arxiv.org/abs/2303.03120
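A hedged side note on the entry above: "conservative" means that forces derive from a single scalar strain-energy density. A generic way to write such a model (the basis functions B_i and coefficients c_i below are placeholders, not the paper's specific interpolants) is

\Psi(\boldsymbol{\varepsilon}) = \sum_i c_i \, B_i(\boldsymbol{\varepsilon}),
\qquad
\boldsymbol{\sigma} = \frac{\partial \Psi}{\partial \boldsymbol{\varepsilon}},
\qquad
\mathbf{f} = -\nabla_{\mathbf{x}} \int_\Omega \Psi\bigl(\boldsymbol{\varepsilon}(\mathbf{x})\bigr)\, dV,

so stresses and nodal forces are exact gradients of the same energy, which keeps the material conservative no matter how expressive the fitted interpolant is.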
We present a self-supervised method to learn dynamic 3D deformations of garments worn by parametric human bodies. State-of-the-art data-driven approaches to model 3D garment deformations are trained using supervised strategies that require large data…
External link:
http://arxiv.org/abs/2204.02219
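The entry above contrasts supervised training, which needs large simulated datasets, with a self-supervised alternative. A common self-supervised recipe (stated here as an assumption about the general approach, not this paper's exact losses) replaces ground-truth garment meshes with physics-based energy terms; every name, weight, and tensor shape below is illustrative.

import torch

def self_supervised_garment_loss(pred_verts, edge_index, rest_lengths, body_sdf):
    # pred_verts: (V, 3) predicted garment vertices; edge_index: (E, 2) long tensor of garment edges;
    # rest_lengths: (E,) rest edge lengths; body_sdf: callable mapping (V, 3) -> (V,) signed distances.
    e = pred_verts[edge_index[:, 0]] - pred_verts[edge_index[:, 1]]
    stretch = ((e.norm(dim=-1) - rest_lengths) ** 2).mean()     # in-plane strain penalty
    gravity = 9.81 * pred_verts[:, 1].mean()                    # potential energy, encourages draping
    collision = torch.relu(-body_sdf(pred_verts)).mean()        # penalize vertices inside the body
    return stretch + 1e-3 * gravity + 10.0 * collision          # weights are illustrative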
Author:
Wang, Jiayi, Mueller, Franziska, Bernard, Florian, Sorli, Suzanne, Sotnychenko, Oleksandr, Qian, Neng, Otaduy, Miguel A., Casas, Dan, Theobalt, Christian
Published in:
ACM Transactions on Graphics (TOG) 39 (6), 1-16, 2020
Tracking and reconstructing the 3D pose and geometry of two hands in interaction is a challenging problem that has a high relevance for several human-computer interaction applications, including AR/VR, robotics, or sign language recognition. Existing…
External link:
http://arxiv.org/abs/2106.11725
Author:
Mueller, Franziska, Davis, Micah, Bernard, Florian, Sotnychenko, Oleksandr, Verschoor, Mickeal, Otaduy, Miguel A., Casas, Dan, Theobalt, Christian
We present a novel method for real-time pose and shape reconstruction of two strongly interacting hands. Our approach is the first two-hand tracking solution that combines an extensive list of favorable properties, namely it is marker-less, uses a si…
External link:
http://arxiv.org/abs/2106.08059
We propose a new generative model for 3D garment deformations that enables us to learn, for the first time, a data-driven method for virtual try-on that effectively addresses garment-body collisions. In contrast to existing methods that require an un…
External link:
http://arxiv.org/abs/2105.06462
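The entry above emphasizes handling garment-body collisions in a learned try-on model. The snippet does not say whether collisions are handled during training or as a post-process, so the sketch below shows only a generic post-process assumption: push any garment vertex that ends up inside the body back along the signed-distance gradient until it clears a small margin.

import numpy as np

def resolve_collisions(garment_verts, body_sdf, body_sdf_grad, margin=2e-3):
    # garment_verts: (V, 3); body_sdf(x) -> (V,) signed distances (negative inside the body);
    # body_sdf_grad(x) -> (V, 3) unit gradients. Function names and the margin are assumptions.
    d = body_sdf(garment_verts)
    inside = d < margin
    out = garment_verts.copy()
    out[inside] += (margin - d[inside])[:, None] * body_sdf_grad(garment_verts)[inside]
    return out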
We present SoftSMPL, a learning-based method to model realistic soft-tissue dynamics as a function of body shape and motion. Datasets to learn such a task are scarce and expensive to generate, which makes training models prone to overfitting. At the co…
External link:
http://arxiv.org/abs/2004.00326
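For the SoftSMPL entry above, the snippet only says that soft-tissue dynamics are modeled as a function of body shape and motion. A minimal sketch of that idea, assuming (without confirmation from the snippet) a recurrent regressor whose per-vertex displacements are added on top of the skinned body surface:

import torch
import torch.nn as nn

class SoftTissueRegressor(nn.Module):
    def __init__(self, num_verts, shape_dim=10, pose_dim=72, hidden=256):
        super().__init__()
        # shape_dim/pose_dim follow common SMPL conventions but are illustrative here
        self.gru = nn.GRU(shape_dim + pose_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_verts * 3)   # per-vertex 3D offsets

    def forward(self, shape, pose_seq):
        # shape: (B, shape_dim); pose_seq: (B, T, pose_dim) -> (B, T, num_verts, 3) displacements
        B, T, _ = pose_seq.shape
        x = torch.cat([shape[:, None, :].expand(B, T, -1), pose_seq], dim=-1)
        h, _ = self.gru(x)              # motion history gives the output its dynamic behavior
        return self.head(h).view(B, T, -1, 3)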
This paper presents a learning-based clothing animation method for highly efficient virtual try-on simulation. Given a garment, we preprocess a rich database of physically-based dressed character simulations, for multiple body shapes and animations.
External link:
http://arxiv.org/abs/1903.07190
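The entry above describes precomputing a database of physically-based dressed-character simulations for many body shapes and animations, and then learning from it so that try-on becomes a cheap regression instead of a full simulation. The training loop below is a sketch under that reading; the network, input dimensions, and dataset interface are assumptions, not the paper's architecture.

import torch
import torch.nn as nn

def train_garment_regressor(dataset, num_verts, epochs=10, lr=1e-3):
    # dataset yields (shape, pose, sim_verts): shape (10,), pose (72,), sim_verts (num_verts, 3),
    # precomputed offline with a cloth simulator; all dimensions here are illustrative.
    model = nn.Sequential(nn.Linear(10 + 72, 256), nn.ReLU(), nn.Linear(256, num_verts * 3))
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for shape, pose, sim_verts in dataset:
            pred = model(torch.cat([shape, pose])).view(num_verts, 3)
            loss = (pred - sim_verts).pow(2).mean()   # regress toward the simulated garment
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model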
Academic article