Human–Robot Labeling Framework to Construct Multitype Real-World Datasets

Author: Ahmed Elsharkawy, Mun Sang Kim
Language: English
Publication year: 2022
Source: IEEE Access, Vol 10, Pp 131166-131180 (2022)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3229864
Description: The rapid development of deep learning object detection models opens great opportunities to build novel robotics applications. Nevertheless, several flaws are observed when deploying state-of-the-art object detection models on robots in real-world environments, attributable to the discrepancy between the robots’ actually observed environments and the training data. In this study, we propose a labeling framework that enables a human to guide a robot in creating multitype datasets for objects in the robot’s surroundings. Our labeling framework requires no labeling tools (e.g., software); instead, it relies on direct, hands-free, gesture-based interaction between humans and robots. Using our labeling framework, we can substantially reduce the effort and time required to collect and label two-dimensional and three-dimensional data. Our system was implemented using a single RGB-D sensor to interact with a mobile robot, position feature points for labeling, and track the mobile robot’s movement. Several robot operating system nodes were designed to give our labeling framework a compact structure. We assessed different components of our framework, demonstrating its effectiveness in generating quality real-world labeled data for color images and point clouds. We also show how our framework can be used to solve object detection problems for mobile robots. Moreover, to evaluate our system with respect to human factors, we conducted a user study in which participants compared our framework with conventional labeling methods. The results show several significant enhancements in usability factors and confirm our framework’s suitability for helping a regular user build custom knowledge for mobile robots effortlessly.
Database: Directory of Open Access Journals