Multi-Model Long Short-Term Memory Network for Gait Recognition Using Window-Based Data Segment

Authors: Thuc D. Nguyen, Hyunil Kim, Lam Dai Tran, Thang Hoang, Deokjai Choi
Year of publication: 2021
Source: IEEE Access, Vol. 9, pp. 23826-23839 (2021)
ISSN: 2169-3536
DOI: 10.1109/access.2021.3056880
Description: Inertial Measurement Unit (IMU)-based gait analysis is a promising and attractive approach for user recognition. Recently, the adoption of deep learning techniques has led to significant performance improvements. However, most existing studies focus on exploiting the spatial information of gait data (using Convolutional Neural Networks (CNNs)), while the temporal part has received little attention. In this study, we propose a new multi-model Long Short-Term Memory (LSTM) network for learning temporal gait features. First, we observe that the LSTM is able to capture patterns hidden inside gait data sequences that are out of synchronization. Thus, instead of using a gait cycle-based segment, our model accepts a gait cycle-free segment (i.e., a fixed-length window) as input. Consequently, the classification task does not depend on gait cycle detection, which usually suffers from noise and bias. Second, we propose a new LSTM network architecture in which one LSTM is used for each gait data channel and a group of consecutive signals is processed at each step. This strategy allows the network to handle long input sequences effectively and achieve improved performance over existing LSTM-based gait models. In addition, we extend the LSTM by combining it with a CNN model to construct a hybrid network, which further improves recognition performance. We evaluated our LSTM and hybrid networks under different settings using the whuGAIT and OU-ISIR datasets. The experiments showed that our LSTM network outperformed existing LSTM networks, and its combination with a CNN established new state-of-the-art performance on both the verification and identification tasks.
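The architectural idea summarized above can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch rendering (not the authors' released code): one independent LSTM per IMU channel, fed a gait-cycle-free fixed-length window whose samples are regrouped so that each LSTM step consumes several consecutive signals. All names and hyperparameters here (window length 128, group size 8, hidden size 64, 118 classes) are illustrative assumptions, not values taken from the paper.

    # Sketch of a multi-model LSTM for window-based gait recognition.
    # Assumptions: 6 IMU channels (3-axis accelerometer + 3-axis gyroscope),
    # fixed-length windows, and illustrative hyperparameter values.
    import torch
    import torch.nn as nn

    class MultiModelLSTM(nn.Module):
        def __init__(self, n_channels=6, window_len=128, group_size=8,
                     hidden_size=64, n_classes=118):
            super().__init__()
            assert window_len % group_size == 0
            self.group_size = group_size
            # One independent LSTM per gait data channel.
            self.lstms = nn.ModuleList(
                nn.LSTM(input_size=group_size, hidden_size=hidden_size,
                        batch_first=True)
                for _ in range(n_channels)
            )
            self.classifier = nn.Linear(n_channels * hidden_size, n_classes)

        def forward(self, x):
            # x: (batch, n_channels, window_len) -- a gait-cycle-free,
            # fixed-length window; no gait cycle detection is required.
            batch, n_channels, window_len = x.shape
            features = []
            for c, lstm in enumerate(self.lstms):
                # Regroup the channel into steps of `group_size` consecutive
                # samples, shortening the sequence the LSTM unrolls over.
                steps = x[:, c].reshape(
                    batch, window_len // self.group_size, self.group_size)
                _, (h_n, _) = lstm(steps)      # h_n: (1, batch, hidden_size)
                features.append(h_n[-1])       # last hidden state per channel
            # Concatenate per-channel features and classify the subject.
            return self.classifier(torch.cat(features, dim=1))

    # Example: a batch of 4 windows, 6 IMU channels, 128 samples each.
    model = MultiModelLSTM()
    logits = model(torch.randn(4, 6, 128))
    print(logits.shape)  # torch.Size([4, 118])

The hybrid variant described in the abstract would additionally pass the same window through a CNN branch and concatenate both feature vectors before the final classification layer; the exact fusion used in the paper is not specified here.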
Database: OpenAIRE