Author:
Yi Wang, Ge Yu, Guan-Yang Liu, Chao Huang, Yu-Hang Wang
Language:
English
Year of publication:
2021
Subject:

Source:
Actuators, Vol 10, Iss 9, p 229 (2021)
Document type:
article
ISSN:
2076-0825
DOI:
10.3390/act10090229
Description:
On-orbit astronauts and scientists on the ground need to cooperate closely to complete space science experiments efficiently. However, as space science experiments grow increasingly diverse, scientists cannot train astronauts on the ground in the details of every experiment. The traditional interaction over visual and auditory channels alone is not enough for scientists to guide astronauts directly through the experiments. An intuitive and transparent interaction interface between scientists and astronauts has to be built to meet the requirements of space science experiments. Therefore, this paper proposes a vibrotactile guidance system for cooperation between scientists and astronauts. We use Kinect V2 sensors to track the movements of the participants in space science experiments, process the data in a virtual experimental environment developed in Unity 3D, and deliver guidance instructions to astronauts through a wearable vibrotactile device. Compared with schemes using only visual and auditory channels, our approach provides more direct and more efficient guidance: the information astronauts perceive is exactly what they need to perform different tasks. Three virtual space science experiment tasks verified the feasibility of the vibrotactile operational guidance system. Participants were able to complete the experimental tasks after a short period of training, and the experimental results show that the method has promising application prospects.
Database:
Directory of Open Access Journals
External link:
