Real-time Non-rigid Reconstruction using an RGB-D Camera
Author: Michael Zollhöfer, Matthias Nießner, Shahram Izadi, Christoph Rhemann, Christopher Zach, Matthew Fisher, Chenglei Wu, Andrew Fitzgibbon, Charles Loop, Christian Theobalt, Marc Stamminger
Year: 2014
Subject: Computer science; Computer graphics; Computer vision; Image processing; Computer Graphics and Computer-Aided Design; Computer Vision and Pattern Recognition; Graphics hardware; Software engineering; Kinematics; Motion capture; Robustness (computer science); RGB color model; Artificial intelligence; Stereo camera; Surface reconstruction; Parametric statistics
Source: ACM Transactions on Graphics 33:1-12
ISSN: 1557-7368, 0730-0301
DOI: | 10.1145/2601097.2601165 |
Description: We present a combined hardware and software solution for markerless reconstruction of non-rigidly deforming physical objects with arbitrary shape in real time. Our system uses a single self-contained stereo camera unit built from off-the-shelf components and consumer graphics hardware to generate spatio-temporally coherent 3D models at 30 Hz. A new stereo matching algorithm estimates real-time RGB-D data. We start by scanning a smooth template model of the subject as they move rigidly. This geometric surface prior avoids strong scene assumptions, such as a kinematic human skeleton or a parametric shape model. Next, a novel GPU pipeline performs non-rigid registration of live RGB-D data to the smooth template using an extended non-linear as-rigid-as-possible (ARAP) framework. High-frequency details are fused onto the final mesh using a linear deformation model. The system is an order of magnitude faster than state-of-the-art methods, while matching the quality and robustness of many offline algorithms. We show precise real-time reconstructions of diverse scenes, including: large deformations of users' heads, hands, and upper bodies; fine-scale wrinkles and folds of skin and clothing; and non-rigid interactions performed by users on flexible objects such as toys. We demonstrate how acquired models can be used for many interactive scenarios, including re-texturing, online performance capture and preview, and real-time shape and motion re-targeting.
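For context, the non-linear ARAP framework named in the description builds on a standard deformation energy. Below is a minimal sketch of that baseline energy in the conventional notation of Sorkine and Alexa's as-rigid-as-possible surface modeling; it is an illustrative assumption, not the paper's extended formulation, which additionally couples this regularizer to dense RGB-D fitting terms solved on the GPU.

```latex
% Baseline ARAP deformation energy (classic form; the paper extends this
% regularizer with data terms tying the template to live RGB-D input).
% p_i  : vertex positions of the smooth template mesh
% p'_i : deformed vertex positions solved for in each frame
% R_i  : per-vertex 3x3 rotation, optimized jointly with p'
% N(i) : one-ring neighbors of vertex i; w_{ij} are edge weights
E_{\mathrm{ARAP}}\bigl(\mathbf{p}', \{\mathbf{R}_i\}\bigr) =
  \sum_{i=1}^{n} \sum_{j \in \mathcal{N}(i)} w_{ij}
  \bigl\| (\mathbf{p}'_i - \mathbf{p}'_j)
        - \mathbf{R}_i (\mathbf{p}_i - \mathbf{p}_j) \bigr\|^{2}
```

The energy penalizes deviation of each one-ring's deformation from a pure rotation, keeping the registration locally rigid while still permitting large global deformations; this is what allows the system to track heads, hands, and cloth without a kinematic skeleton or parametric shape prior.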
Database: OpenAIRE
External link: