TraHGR: Transformer for Hand Gesture Recognition via Electromyography

Authors: Soheil Zabihi, Elahe Rahimian, Amir Asif, Arash Mohammadi
Language: English
Year of publication: 2023
Subject:
Source: IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 31, pp. 4211-4224 (2023)
Document type: article
ISSN: 1558-0210
DOI: 10.1109/TNSRE.2023.3324252
Description: Deep learning-based Hand Gesture Recognition (HGR) via surface Electromyogram (sEMG) signals has recently shown considerable potential for the development of advanced myoelectric-controlled prostheses. Although deep learning techniques can improve HGR accuracy compared to their classical counterparts, classifying hand movements based on sparse multichannel sEMG signals is still a challenging task. Furthermore, existing deep learning approaches typically include only a single model and, as such, can hardly extract representative features. In this paper, we aim to address this challenge by capitalizing on recent advances in hybrid models and transformers. In other words, we propose a hybrid framework based on the transformer architecture, a relatively new and revolutionary deep learning model. The proposed hybrid architecture, referred to as the Transformer for Hand Gesture Recognition (TraHGR), consists of two parallel paths followed by a linear layer that acts as a fusion center to integrate the advantages of each module. We evaluated the proposed TraHGR architecture on the commonly used second Ninapro dataset, referred to as DB2. The sEMG signals in the DB2 dataset are measured in real-life conditions from 40 healthy users, each performing 49 gestures. We have conducted an extensive set of experiments to test and validate the proposed TraHGR architecture and to compare its achievable accuracy with several recently proposed HGR classification algorithms on the same dataset. We have also compared the results of the proposed TraHGR architecture with each individual path and demonstrated the distinguishing power of the proposed hybrid architecture.
The recognition accuracies of the proposed TraHGR architecture for a window size of 200 ms and a step size of 100 ms are 86.00%, 88.72%, 81.27%, and 93.74%, which are 2.30%, 4.93%, 8.65%, and 4.20% higher than the state-of-the-art performance for DB2 (49 gestures), DB2-B (17 gestures), DB2-C (23 gestures), and DB2-D (9 gestures), respectively.
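The fusion idea described in the abstract — two parallel paths whose outputs are combined by a single linear layer acting as a fusion center — can be illustrated with a minimal numerical sketch. This is not the actual TraHGR model: the two transformer-based paths are replaced by random stand-in linear maps, and all dimensions other than the 49-class DB2 output are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_CLASSES = 49   # DB2 gesture count, per the abstract
D_IN = 12 * 40     # hypothetical flattened sEMG window (channels x samples)
D_PATH = 64        # hypothetical per-path embedding size

# Random stand-ins for the two parallel paths (the real paths are transformers).
W_a = rng.standard_normal((D_IN, D_PATH)) * 0.05
W_b = rng.standard_normal((D_IN, D_PATH)) * 0.05

# Fusion center: one linear layer over the concatenated path outputs.
W_fuse = rng.standard_normal((2 * D_PATH, NUM_CLASSES)) * 0.05

def classify(window):
    """Fuse the two path embeddings and return class probabilities."""
    z_a = np.tanh(window @ W_a)          # embedding from path 1
    z_b = np.tanh(window @ W_b)          # embedding from path 2
    logits = np.concatenate([z_a, z_b]) @ W_fuse
    e = np.exp(logits - logits.max())    # numerically stable softmax
    return e / e.sum()

probs = classify(rng.standard_normal(D_IN))
print(probs.shape)  # one probability per gesture class
```

The point of the sketch is only the topology: each path produces its own representation of the same input window, and the trailing linear layer learns how to weight the two representations jointly, which is what lets the hybrid outperform either path alone.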
Database: Directory of Open Access Journals