Author:
Ethan R. Jones, Winyu Chinthammit, Weidong Huang, Ulrich Engelke, Christopher Lueg
Language:
English
Year of publication:
2018
Subject:

Source:
Symmetry, Vol 10, Iss 12, p 680 (2018)
Document type:
article
ISSN:
2073-8994
DOI:
10.3390/sym10120680
Description:
Control of robot arms is often required in engineering and can be performed by using different methods. This study examined and symmetrically compared the use of a controller, an eye gaze tracker, and a combination thereof in a multimodal setup for control of a robot arm. Tasks of different complexities were defined and twenty participants completed an experiment using these interaction modalities to solve the tasks. More specifically, there were three tasks: the first was to navigate a chess piece from one square to another pre-specified square; the second was the same as the first task, but required more moves to complete; and the third was to move multiple pieces to reach a pre-defined arrangement of the pieces. Further, while gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations with regard to spatial accuracy and target selection. The multimodal setup aimed to mitigate the weaknesses of the eye gaze tracker, creating a superior system without simply relying on the controller. The experiment shows that the multimodal setup improved performance over the eye gaze tracker alone (p < 0.05) and was competitive with the controller-only setup, although it did not outperform it (p > 0.05).
Database:
Directory of Open Access Journals