Acceleration-based Activity Recognition of Repetitive Works with Lightweight Ordered-work Segmentation Network

Authors: Naoya Yoshimura, Takuya Maekawa, Takahiro Hara, Atsushi Wada, Yasuo Namioka
Year of publication: 2022
Source: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 6:1-39
ISSN: 2474-9567
Description: This study presents a new neural network model, the Lightweight Ordered-work Segmentation Network (LOS-Net), for recognizing manual work using body-worn accelerometers in industrial settings. In industrial domains, a human worker typically performs a set of predefined processes repetitively, with each process consisting of a sequence of activities in a predefined order. State-of-the-art activity recognition models, such as encoder-decoder models, have numerous trainable parameters, making them difficult to train in industrial domains because of the substantial cost of preparing a large amount of labeled data. In contrast, LOS-Net is designed to be trained on a limited amount of training data. Specifically, its decoder has few trainable parameters and is designed to capture only the information necessary for precise recognition of ordered work: (i) boundary information between consecutive activities, because a transition between performed activities is generally associated with a trend change in the sensor data collected during manual work, and (ii) long-term context about the ordered work, e.g., information about the previous and next activities, which is useful for recognizing the current activity. This information is obtained by introducing a module that can collect it from distant time steps using few trainable parameters. Moreover, LOS-Net can refine the decoder's activity estimates by incorporating prior knowledge about the order of activities. We demonstrate the effectiveness of LOS-Net using sensor data collected from workers in actual factories and a logistics center, and show that it achieves state-of-the-art performance.
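The abstract's final refinement idea, using the known activity order as a prior to correct framewise estimates, can be illustrated with a minimal sketch. This is not the paper's actual refinement module; it is a hypothetical Viterbi-style dynamic program that, given per-frame activity scores and the predefined order, finds the best label sequence that traverses the activities monotonically in that order, so an isolated noisy frame cannot break the sequence.

```python
def refine_with_order(scores, order):
    """Hypothetical order-prior refinement (not the paper's exact module).

    scores: list of dicts, one per time step, mapping activity -> score
    order:  the predefined sequence of activities in this process
    Returns the highest-scoring label sequence that follows `order`
    monotonically (each frame stays at the current activity or advances
    to the next one), found by dynamic programming.
    """
    T, K = len(scores), len(order)
    NEG = float("-inf")
    # dp[t][k]: best total score when frame t is at order position k
    dp = [[NEG] * K for _ in range(T)]
    back = [[0] * K for _ in range(T)]
    dp[0][0] = scores[0][order[0]]
    for t in range(1, T):
        for k in range(K):
            stay = dp[t - 1][k]                       # remain in activity k
            move = dp[t - 1][k - 1] if k > 0 else NEG  # advance from k-1
            if stay >= move:
                dp[t][k], back[t][k] = stay, k
            else:
                dp[t][k], back[t][k] = move, k - 1
            dp[t][k] += scores[t][order[k]]
    # Backtrack from the final activity in the order.
    k = K - 1
    labels = [order[k]]
    for t in range(T - 1, 0, -1):
        k = back[t][k]
        labels.append(order[k])
    return labels[::-1]
```

For example, if a noisy middle frame scores highest for a later activity, the order constraint forces the globally best monotone assignment, recovering the correct intermediate label.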
Database: OpenAIRE