A shared robot control system combining augmented reality and motor imagery brain-computer interfaces with eye tracking.

Author: Dillen A; Human Physiology and Sports Physiotherapy Research Group, Vrije Universiteit Brussel, Pleinlaan 2, Brussel, 1050, BELGIUM., Omidi M; Brussels Human Robotics Research Center (BruBotics), Vrije Universiteit Brussel, Pleinlaan 2, Brussel, 1050, BELGIUM., Ghaffari F; Equipe Traitement de l'Information et Systèmes, CY Cergy Paris University, 6 Rue du Ponceau, Cergy-Pontoise, 95000, FRANCE., Vanderborght B; Brussels Human Robotics Research Center (BruBotics), Vrije Universiteit Brussel, Pleinlaan 2, Brussel, 1050, BELGIUM., Roelands B; Human Physiology and Sports Physiotherapy Research Group, Vrije Universiteit Brussel, Pleinlaan 2, Brussel, 1050, BELGIUM., Romain O; Equipe Traitement de l'Information et Systèmes, CY Cergy Paris University, 6 Rue du Ponceau, Cergy-Pontoise, 95000, FRANCE., Nowé A; Artificial Intelligence research group, Vrije Universiteit Brussel, Pleinlaan 2, Brussel, 1050, BELGIUM., De Pauw K; Human Physiology and Sports Physiotherapy Research Group, Vrije Universiteit Brussel, Pleinlaan 2, Brussel, 1050, BELGIUM.
Language: English
Source: Journal of neural engineering [J Neural Eng] 2024 Sep 25. Date of Electronic Publication: 2024 Sep 25.
DOI: 10.1088/1741-2552/ad7f8d
Abstract: Objective: Brain-computer interface (BCI) control systems monitor neural activity to detect the user's intentions, enabling device control through mental imagery. Despite their potential, decoding neural activity in real-world conditions poses significant challenges, making BCIs currently impractical compared to traditional interaction methods. This study introduces a novel motor imagery (MI) BCI control strategy for operating a physically assistive robotic arm, addressing the difficulties of MI decoding from electroencephalogram (EEG) signals, which are inherently non-stationary and vary across individuals. Approach: A proof-of-concept BCI control system was developed using commercially available hardware, integrating MI with eye tracking in an augmented reality (AR) user interface to facilitate a shared control approach. This system proposes actions based on the user's gaze, enabling selection through imagined movements. A user study was conducted to evaluate the system's usability, focusing on its effectiveness and efficiency. Main results: Participants performed tasks that simulated everyday activities with the robotic arm, demonstrating the shared control system's feasibility and practicality in real-world scenarios. Despite low online decoding performance (mean accuracy: 0.529, F1: 0.29, Cohen's Kappa: 0.12), participants achieved a mean success rate of 0.83 in the final phase of the user study when given 15 minutes to complete the evaluation tasks. The success rate dropped below 0.5 when a 5-minute cutoff time was applied. Significance: These results indicate that integrating AR and eye tracking can significantly enhance the usability of BCI systems, despite the complexities of MI-EEG decoding. While efficiency remains low, the effectiveness of our approach was verified. This suggests that BCI systems have the potential to become a viable interaction modality for everyday applications in the future.
(Creative Commons Attribution license.)
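The abstract reports the online decoder's mean accuracy, F1, and Cohen's Kappa. As a reminder of what Kappa measures (agreement corrected for chance-level guessing), here is a minimal sketch; the label sequences are illustrative and are not data from the study:

```python
from collections import Counter

def cohens_kappa(y_true, y_pred):
    """Chance-corrected agreement between true and predicted labels."""
    n = len(y_true)
    # observed agreement (plain accuracy)
    p_o = sum(t == p for t, p in zip(y_true, y_pred)) / n
    true_counts = Counter(y_true)
    pred_counts = Counter(y_pred)
    # expected agreement if predictions matched only by chance,
    # given each side's class frequencies (marginals)
    p_e = sum(true_counts[c] * pred_counts.get(c, 0) for c in true_counts) / n**2
    return (p_o - p_e) / (1 - p_e)

# illustrative two-class example
y_true = [0, 0, 0, 1, 1, 1, 0, 1]
y_pred = [0, 1, 0, 1, 0, 1, 0, 0]
print(round(cohens_kappa(y_true, y_pred), 2))  # → 0.25
```

A Kappa of 0.12, as reported for the online decoder, is only slightly above the 0 expected from chance agreement, which underlines why the shared control design (gaze-based action proposals confirmed by MI) was needed to reach the 0.83 task success rate.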
Database: MEDLINE