Decoding Hand Motor Imagery Tasks Within the Same Limb From EEG Signals Using Deep Learning

Author: David Achanccaray, Mitsuhiro Hayashibe
Year of publication: 2020
Subject:
Source: IEEE Transactions on Medical Robotics and Bionics. 2:692-699
ISSN: 2576-3202
DOI: 10.1109/tmrb.2020.3025364
Description: Motor imagery (MI) tasks of different body parts have been successfully decoded by conventional classifiers such as LDA and SVM. Decoding MI tasks within the same limb, however, remains challenging for these classifiers, even though it would provide more options for controlling robotic devices. This work proposes to improve the decoding of hand MI tasks within the same limb in a brain-computer interface (BCI) using convolutional neural networks (CNNs); the EEGNet CNN, LDA, and SVM classifiers were evaluated for two (flexion/extension) and three (flexion/extension/grasping) MI tasks. To the best of our knowledge, this is the first attempt to apply CNNs to this problem. In addition, visual and electrotactile stimulation were included as BCI training reinforcement after the MI task, similarly to feedback sessions, and the two were compared. EEGNet achieved maximum mean accuracies of 78.46% (±12.50%) and 76.72% (±11.67%) for two and three classes, respectively, outperforming the conventional classifiers (around 60% and 48%) and similar works (below 67% and 75%, respectively). Moreover, the advantage of electrotactile stimulation over the visual stimulus was not significant during the calibration session. The deep learning scheme enhanced the decoding of MI tasks within the same limb compared with the conventional framework. (A minimal illustrative sketch of such a classifier comparison follows this record.)
Database: OpenAIRE
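
The sketch below is not the authors' implementation; it only illustrates the kind of comparison the abstract describes: an EEGNet-style compact CNN against LDA and SVM baselines on epoched EEG data. The data shapes, hyperparameters, synthetic placeholder data, and the CompactCNN class are assumptions for illustration, not details from the paper.

```python
# Minimal sketch (assumed, not the authors' code): EEGNet-style CNN vs. LDA/SVM
# for within-limb motor-imagery decoding on placeholder EEG epochs.
import numpy as np
import torch
import torch.nn as nn
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Assumed dimensions and synthetic data standing in for real EEG epochs/labels.
n_trials, n_channels, n_samples, n_classes = 200, 8, 256, 2
X = np.random.randn(n_trials, n_channels, n_samples).astype(np.float32)
y = np.random.randint(0, n_classes, n_trials)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

# Conventional baselines on flattened epochs (the paper uses richer feature extraction).
flat_tr, flat_te = X_tr.reshape(len(X_tr), -1), X_te.reshape(len(X_te), -1)
for name, clf in [("LDA", LinearDiscriminantAnalysis()), ("SVM", SVC(kernel="rbf"))]:
    clf.fit(flat_tr, y_tr)
    print(name, "accuracy:", clf.score(flat_te, y_te))

# EEGNet-style compact CNN: temporal conv -> depthwise spatial conv -> separable conv.
class CompactCNN(nn.Module):
    def __init__(self, n_channels, n_samples, n_classes, F1=8, D=2, F2=16):
        super().__init__()
        self.block1 = nn.Sequential(
            nn.Conv2d(1, F1, (1, 64), padding=(0, 32), bias=False),      # temporal filters
            nn.BatchNorm2d(F1),
            nn.Conv2d(F1, F1 * D, (n_channels, 1), groups=F1, bias=False),  # depthwise spatial filter
            nn.BatchNorm2d(F1 * D), nn.ELU(), nn.AvgPool2d((1, 4)), nn.Dropout(0.25),
        )
        self.block2 = nn.Sequential(
            nn.Conv2d(F1 * D, F1 * D, (1, 16), padding=(0, 8), groups=F1 * D, bias=False),
            nn.Conv2d(F1 * D, F2, 1, bias=False),                        # pointwise conv
            nn.BatchNorm2d(F2), nn.ELU(), nn.AvgPool2d((1, 8)), nn.Dropout(0.25),
        )
        with torch.no_grad():  # infer flattened feature size with a dummy pass
            n_feat = self.block2(self.block1(torch.zeros(1, 1, n_channels, n_samples))).numel()
        self.classifier = nn.Linear(n_feat, n_classes)

    def forward(self, x):  # x: (batch, 1, channels, samples)
        x = self.block2(self.block1(x))
        return self.classifier(x.flatten(1))

model = CompactCNN(n_channels, n_samples, n_classes)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
xb = torch.from_numpy(X_tr).unsqueeze(1)  # add a singleton "image" channel
yb = torch.from_numpy(y_tr).long()
for epoch in range(20):  # short full-batch training loop, for illustration only
    opt.zero_grad()
    loss = loss_fn(model(xb), yb)
    loss.backward()
    opt.step()
with torch.no_grad():
    preds = model(torch.from_numpy(X_te).unsqueeze(1)).argmax(1).numpy()
print("CNN accuracy:", (preds == y_te).mean())
```

With real, preprocessed MI epochs in place of the random arrays, the same script structure would reproduce the kind of classifier comparison reported in the abstract, where the compact CNN is expected to outperform the flattened-feature LDA and SVM baselines.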