Comparing Non-Visual and Visual Guidance Methods for Narrow Field of View Augmented Reality Displays
| Author: | Tom David Eibich, Ernst Kruijff, Christina Trepkowski, Johannes Schöning, Alexander Marquardt, Jens Maiero |
| --- | --- |
| Year of publication: | 2020 |
| Subjects: | Adult, Male, Situation awareness, Computer science, User-Computer Interface, Young Adult, Data visualization, Human–computer interaction, Task Performance and Analysis, Computer Graphics, Humans, Augmented Reality, Virtual Reality, Equipment Design, Awareness, Middle Aged, Computer Graphics and Computer-Aided Design, Visualization, Signal Processing, Task analysis, Female, Computer Vision and Pattern Recognition, Cues, Software |
| Source: | IEEE Transactions on Visualization and Computer Graphics, 26(12) |
| ISSN: | 1941-0506 |
| Description: | Current augmented reality displays still have a very limited field of view compared to human vision. To help users localize out-of-view objects, researchers have predominantly explored visual guidance approaches that visualize directional information within the limited (in-view) screen space. Unfortunately, visual conflicts such as clutter or occlusion of information often arise, which can degrade search performance and reduce awareness of the physical environment. In this paper, we compare an innovative non-visual guidance approach based on audio-tactile cues with the state-of-the-art visual guidance technique EyeSee360 for localizing out-of-view objects on augmented reality displays with a limited field of view. In our user study, we evaluate both guidance methods in terms of search performance and situation awareness. We show that although audio-tactile guidance generally yields slower search times than the well-performing EyeSee360, it is on a par regarding hit rate. Moreover, the audio-tactile method provides a significant improvement in situation awareness over the visual approach. |
| Database: | OpenAIRE |
| External link: | |