Deep Learning for Activity Recognition in Older People Using a Pocket-Worn Smartphone

Authors: Yashi Nan, Nigel H. Lovell, Stephen J. Redmond, Kejia Wang, Kim Delbaere, Kimberley S. van Schooten
Language: English
Year of publication: 2020
Subject:
Source: Sensors, Vol 20, Iss 24, p 7195 (2020)
Document type: article
ISSN: 1424-8220
DOI: 10.3390/s20247195
Description: Activity recognition can provide useful information about an older individual’s activity level and encourage older people to become more active to live longer in good health. This study aimed to develop an activity recognition algorithm for smartphone accelerometry data of older people. Deep learning algorithms, including convolutional neural networks (CNNs) and long short-term memory (LSTM) networks, were evaluated in this study. Smartphone accelerometry data of free-living activities, performed by 53 older people (83.8 ± 3.8 years; 38 male) under standardized circumstances, were classified into lying, sitting, standing, transition, walking, walking upstairs, and walking downstairs. A 1D CNN, a multichannel CNN, a CNN-LSTM, and a multichannel CNN-LSTM model were tested. The models were compared on accuracy and computational efficiency. Results show that the multichannel CNN-LSTM model achieved the best classification results, with 81.1% accuracy and acceptable model and time complexity. Specifically, the accuracy was 67.0% for lying, 70.7% for sitting, 88.4% for standing, 78.2% for transitions, 88.7% for walking, 65.7% for walking downstairs, and 68.7% for walking upstairs. The findings indicate that the multichannel CNN-LSTM model is feasible for smartphone-based activity recognition in older people.
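As an illustration of the kind of architecture the abstract describes, the sketch below shows one plausible multichannel CNN-LSTM for windows of tri-axial accelerometer data classified into the seven activity classes. It is a minimal, hypothetical example: the interpretation of "multichannel" as parallel 1D convolutional branches with different kernel sizes, as well as the window length, kernel sizes, channel counts, and LSTM hidden size, are assumptions, since the record does not give the paper's hyperparameters.

```python
# Hypothetical multichannel CNN-LSTM sketch for 3-axis accelerometer windows.
# All hyperparameters below are illustrative assumptions, not the paper's values.
import torch
import torch.nn as nn

NUM_CLASSES = 7    # lying, sitting, standing, transition, walking, up/downstairs
WINDOW_LEN = 128   # assumed samples per window (e.g., ~2.5 s at ~50 Hz)
NUM_AXES = 3       # tri-axial accelerometer: x, y, z


class MultichannelCNNLSTM(nn.Module):
    def __init__(self, kernel_sizes=(3, 7, 11), conv_channels=32, lstm_hidden=64):
        super().__init__()
        # One 1D-convolutional branch ("channel") per kernel size; each sees the
        # raw tri-axial signal and extracts features at a different time scale.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(NUM_AXES, conv_channels, k, padding=k // 2),
                nn.ReLU(),
                nn.MaxPool1d(2),
            )
            for k in kernel_sizes
        ])
        # The LSTM consumes the concatenated branch features as a time series.
        self.lstm = nn.LSTM(input_size=conv_channels * len(kernel_sizes),
                            hidden_size=lstm_hidden, batch_first=True)
        self.classifier = nn.Linear(lstm_hidden, NUM_CLASSES)

    def forward(self, x):                                # x: (batch, NUM_AXES, WINDOW_LEN)
        feats = [branch(x) for branch in self.branches]  # each: (batch, C, WINDOW_LEN // 2)
        feats = torch.cat(feats, dim=1)                  # (batch, C * n_branches, T)
        feats = feats.permute(0, 2, 1)                   # (batch, T, features) for the LSTM
        _, (h_n, _) = self.lstm(feats)
        return self.classifier(h_n[-1])                  # logits over the 7 activity classes


# Minimal shape check on a random batch of accelerometer windows.
model = MultichannelCNNLSTM()
logits = model(torch.randn(8, NUM_AXES, WINDOW_LEN))
print(logits.shape)  # torch.Size([8, 7])
```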
Database: Directory of Open Access Journals