Generation of an Annotated Dataset of Human Poses for Deep Learning Networks Based on a Motion Tracking System

Authors: Igor Mikhailovich Artamonov, Alexander Efitorov, Vladimir Shirokii, Yana Nikolaevna Artamonova, Oleg Vasilyev
Year of publication: 2020
Subject:
Source: Advances in Neural Computation, Machine Learning, and Cognitive Research IV ISBN: 9783030605766
Description: In this paper, we propose an original method for relatively fast generation of an annotated dataset of human poses for training deep neural networks, based on a 3D motion capture system. Compared to default pose-detection DNNs trained on commonly used open datasets, the method makes it possible to recognize specific poses and actions more accurately and reduces the need for additional image processing operations aimed at correcting the various detection errors inherent to these DNNs. We used a preinstalled IR motion capture system with reflective passive tags not to capture movement itself but to extract human keypoints in 3D space, and recorded video at the corresponding timestamps. The obtained 3D trajectories were synchronized in time and space with the streams from several cameras using mutual camera calibration and photogrammetry. This allowed us to accurately project keypoints from 3D space onto the 2D video frame plane, generate human pose annotations for the recorded video, and train a deep neural network on this dataset.
Database: OpenAIRE
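
The keypoint-projection step described in the abstract (mapping 3D motion-capture keypoints onto a 2D video frame after mutual camera calibration) can be illustrated with the following minimal Python sketch. It is not taken from the paper: the camera intrinsics, extrinsics, and keypoint coordinates are hypothetical placeholders, and OpenCV's cv2.projectPoints is used only as a stand-in for whatever photogrammetry pipeline the authors employed.

    import numpy as np
    import cv2

    # Hypothetical 3D keypoints (metres, world frame) taken from the motion capture system.
    keypoints_3d = np.array([[0.00, 1.70, 2.50],   # head
                             [0.00, 1.40, 2.50],   # chest
                             [0.20, 0.90, 2.50]],  # right hip
                            dtype=np.float64)

    # Hypothetical camera parameters obtained from mutual camera calibration.
    camera_matrix = np.array([[1200.0,    0.0, 960.0],
                              [   0.0, 1200.0, 540.0],
                              [   0.0,    0.0,   1.0]])
    dist_coeffs = np.zeros(5)   # assume negligible lens distortion
    rvec = np.zeros(3)          # world-to-camera rotation (Rodrigues vector)
    tvec = np.zeros(3)          # world-to-camera translation

    # Project the 3D keypoints onto the 2D image plane of one camera.
    points_2d, _ = cv2.projectPoints(keypoints_3d, rvec, tvec, camera_matrix, dist_coeffs)
    annotations = points_2d.reshape(-1, 2)   # (u, v) pixel coordinates per keypoint
    print(annotations)                       # these pairs form the pose annotation for the frame

In the workflow described by the authors, such projected pixel coordinates would be produced per video frame (matched by timestamp) and stored as the pose annotations used to train the deep neural network.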