Human Lower Limb Motion Capture and Recognition Based on Smartphones.

Author: Duan LT; School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu 610054, China; School of Computer Science, Chengdu University, Chengdu 610106, China. Lawo M; International Graduate School for Dynamics in Logistics, Bremen University, 28359 Bremen, Germany. Wang ZG; School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu 610054, China. Wang HY; School of Computer Science, Chengdu University, Chengdu 610106, China.
Language: English
Source: Sensors (Basel, Switzerland) [Sensors (Basel)] 2022 Jul 14; Vol. 22 (14). Date of Electronic Publication: 2022 Jul 14.
DOI: 10.3390/s22145273
Abstract: Human motion recognition based on wearable devices plays a vital role in pervasive computing. Smartphones have built-in motion sensors that measure the motion of the device with high precision. In this paper, we propose a human lower limb motion capture and recognition approach based on a smartphone. We design a motion logger to record five categories of limb activities (standing up, sitting down, walking, going upstairs, and going downstairs) using two motion sensors (tri-axial accelerometer, tri-axial gyroscope). We extract motion features and select a subset of them as a feature vector from the frequency domain of the sensing data using the Fast Fourier Transform (FFT). We classify and predict human lower limb motion using three supervised learning algorithms: Naïve Bayes (NB), K-Nearest Neighbor (KNN), and Artificial Neural Networks (ANNs). We use 670 lower limb motion samples to train and verify these classifiers using the 10-fold cross-validation technique. Finally, we design and implement a live detection system to validate our motion detection approach. The experimental results show that our low-cost approach can recognize human lower limb activities with acceptable accuracy. On average, the recognition rates of NB, KNN, and ANNs are 97.01%, 96.12%, and 98.21%, respectively.
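The pipeline described in the abstract (FFT-based frequency-domain features from tri-axial accelerometer/gyroscope windows, a supervised classifier, and 10-fold cross-validation) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic signals, the number of FFT coefficients kept, the window length, and the sampling rate are all assumptions made for the example; the paper's 670 real motion samples and exact feature selection are not reproduced here.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fft_features(window, n_coeffs=8):
    # window: (n_samples, 6) -- 3 accelerometer + 3 gyroscope axes.
    # Feature vector: magnitudes of the first n_coeffs non-DC FFT bins
    # of each axis, concatenated (a stand-in for the paper's selected
    # frequency-domain feature subset).
    spec = np.abs(np.fft.rfft(window, axis=0))
    return spec[1:1 + n_coeffs].T.ravel()

def synth_window(freq, n=128, fs=50.0):
    # Toy periodic motion on all 6 axes with a given dominant frequency
    # plus noise (hypothetical data; the paper logs real sensor streams).
    t = np.arange(n) / fs
    base = np.sin(2 * np.pi * freq * t)
    return np.column_stack(
        [base + 0.1 * rng.standard_normal(n) for _ in range(6)]
    )

# Two toy "activities" distinguished only by dominant frequency.
X = np.array([fft_features(synth_window(f))
              for f in [1.0] * 50 + [3.0] * 50])
y = np.array([0] * 50 + [1] * 50)

# KNN classifier evaluated with 10-fold cross-validation, as in the paper.
clf = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(clf, X, y, cv=10)
print(f"mean 10-fold accuracy: {scores.mean():.2f}")
```

The same `X`, `y`, and `cross_val_score` call work unchanged with `GaussianNB` or `MLPClassifier` in place of `KNeighborsClassifier`, mirroring the paper's comparison of NB, KNN, and ANN classifiers on one shared feature set.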
Database: MEDLINE