Human Motion Transfer from Poses in the Wild
Author: | Menglei Chai, Xiaohui Shen, Jian Ren, Jianchao Yang, Chen Fang, Sergey Tulyakov |
---|---|
Year of publication: | 2020 |
Subject: | Network architecture, Pixel, Computer science, Software engineering, Machine learning, Translation (geometry), Bridge (nautical), Motion (physics), Task (project management), Robustness (computer science), Artificial intelligence, Generator (mathematics) |
Source: | Computer Vision – ECCV 2020 Workshops, ISBN 9783030670696, ECCV Workshops (3) |
Description: | In this paper, we tackle the problem of human motion transfer, where we synthesize a novel motion video for a target person that imitates the movement from a reference video. It is a video-to-video translation task in which estimated poses are used to bridge the two domains. Despite substantial progress on the topic, previous methods suffer from several problems. First, there is a domain gap between training and testing pose sequences: the model is tested on poses it has not seen during training, such as difficult dancing moves. Furthermore, pose detection errors are inevitable, making the generator's job harder. Finally, generating realistic pixels from sparse poses is challenging in a single step. To address these challenges, we introduce a novel pose-to-video translation framework for generating high-quality videos that are temporally coherent even for in-the-wild pose sequences unseen during training. We propose a pose augmentation method to minimize the training-test gap, a unified paired and unpaired learning strategy to improve robustness to detection errors, and a two-stage network architecture to achieve superior texture quality. To further boost research on the topic, we build two human motion datasets. Finally, we show the superiority of our approach over state-of-the-art methods through extensive experiments and evaluations on different datasets. |
Database: | OpenAIRE |
External link: |
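
The description above mentions a pose augmentation method for narrowing the training-test pose gap and for tolerating detection errors, but the record does not include implementation details. As a minimal illustrative sketch only, the snippet below shows one plausible form of keypoint-level pose augmentation: jittering, rescaling, and randomly dropping 2D joints to mimic unseen poses and pose-estimator noise. The function name, parameters, and noise model are assumptions for illustration, not the authors' code.

```python
import numpy as np

def augment_pose(keypoints, max_jitter=0.02, max_scale=0.05,
                 drop_prob=0.05, rng=None):
    """Perturb a (J, 2) array of 2D keypoints normalized to [0, 1].

    Returns a perturbed copy; dropped joints are set to NaN to
    mimic missed detections. All hyperparameters are hypothetical.
    """
    if rng is None:
        rng = np.random.default_rng()
    kp = keypoints.astype(float).copy()

    # Global scale jitter around the skeleton centroid, widening
    # the distribution of body sizes seen during training.
    center = np.nanmean(kp, axis=0)
    scale = 1.0 + rng.uniform(-max_scale, max_scale)
    kp = center + (kp - center) * scale

    # Per-joint positional noise, simulating pose-detection error.
    kp += rng.uniform(-max_jitter, max_jitter, size=kp.shape)

    # Randomly drop joints, simulating missed detections.
    dropped = rng.random(len(kp)) < drop_prob
    kp[dropped] = np.nan

    # Clip back into the normalized frame; NaNs pass through.
    return np.clip(kp, 0.0, 1.0)

# Example usage on a hypothetical 18-joint OpenPose-style skeleton.
pose = np.random.default_rng(0).random((18, 2))
augmented = augment_pose(pose)
```

Applying such perturbations at training time exposes the generator to noisy and out-of-distribution skeletons, which is one way the training-test gap and detection-error robustness described in the abstract could be addressed.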