Pattern-Based Cloth Registration and Sparse-View Animation.

Author: Halimi, Oshri; Stuyck, Tuur; Xiang, Donglai; Bagautdinov, Timur; Wen, He; Kimmel, Ron; Shiratori, Takaaki; Wu, Chenglei; Sheikh, Yaser; Prada, Fabian
Subject:
Source: ACM Transactions on Graphics; Dec 2022, Vol. 41, Issue 6, p. 1-17, 17 p.
Abstract: We propose a novel multi-view camera pipeline for the reconstruction and registration of dynamic clothing. Our method relies on a specifically designed pattern that allows for precise video tracking in each camera view. We triangulate the tracked points and register the cloth surface at fine geometric resolution with low localization error. Compared to state-of-the-art methods, our registration exhibits stable correspondence, tracking the same points on the deforming cloth surface across the temporal sequence. As an application, we demonstrate how our registration pipeline greatly improves state-of-the-art pose-based drivable cloth models. Furthermore, we propose a novel model, Garment Avatar, for driving cloth from a dense tracking signal obtained from two opposing camera views. The method produces realistic reconstructions that are faithful to the actual geometry of the deforming cloth. In this setting, the user wears a garment with our custom pattern, which enables our driving model to reconstruct the geometry. Our code and data are available at https://github.com/HalimiOshri/Pattern-Based-Cloth-Registration-and-Sparse-View-Animation. The released data includes our pattern and registered mesh sequences containing four different subjects and 15k frames in total. [ABSTRACT FROM AUTHOR]
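As a hedged illustration of the multi-view triangulation step mentioned in the abstract (not the authors' released code), the sketch below shows direct linear transform (DLT) triangulation of a tracked pattern point from two calibrated views; the camera matrices and pixel coordinates are hypothetical placeholders.

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Triangulate one 3D point from two views via the direct linear transform.

    P1, P2 : (3, 4) camera projection matrices.
    x1, x2 : (2,) pixel coordinates of the same tracked pattern point in each view.
    Returns the (3,) point in world coordinates.
    """
    # Each view contributes two rows of the homogeneous system A X = 0.
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

if __name__ == "__main__":
    # Hypothetical calibrated cameras: a reference view and a view shifted along x.
    K = np.array([[1000.0, 0.0, 512.0],
                  [0.0, 1000.0, 512.0],
                  [0.0, 0.0, 1.0]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

    # Project a known 3D point to generate consistent synthetic observations.
    X_true = np.array([0.2, -0.1, 3.0, 1.0])
    x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
    x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]

    print(triangulate_dlt(P1, P2, x1, x2))  # ~ [0.2, -0.1, 3.0]
```

In the paper's setting, such per-point triangulation would run over all tracked pattern correspondences before the surface registration step; the synthetic data above simply verifies the triangulation recovers the original point.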
Database: Complementary Index