Scaling Gain and Eyeheight While Locomoting in a Large VE

Authors: Thomas H. Carr, Gayathri Narasimham, John J. Rieser, Bobby Bodenheimer, Betsy Williams-Sanders, Timothy P. McNamara
Year of publication: 2019
Subject:
Source: Virtual, Augmented and Mixed Reality. Multimodal Interaction ISBN: 9783030216061
HCI (9)
DOI: 10.1007/978-3-030-21607-8_22
Description: Virtual Environments (VEs) presented through head-mounted displays (HMDs) are often explored on foot. This type of exploration is useful because the inertial cues of physical locomotion aid spatial awareness. However, the size of the VE that can be explored on foot is limited to the dimensions of the HMD's tracking space unless locomotion is somehow manipulated. This paper presents a system for exploring a large VE on foot when the physical surroundings are small, by leveraging people's natural ability to maintain spatial awareness through their own locomotion. We examine two strategies for increasing the explorable size of the virtual space: scaling the translational gain of walking and scaling eyeheight. Translational gain is scaled by changing the relationship between physical and visual translation so that one step forward in physical space corresponds to several steps forward in virtual space. To scale gain by factors greater than ten, it becomes necessary to minimize the distracting effect of small physical head motions, and we present a method for doing so. We examine a range of scaling factors and find that translational gain can be scaled by a factor of 50. This paper also investigates whether scaling eyeheight proportionally to gain increases spatial awareness. We found that the map-like overview of the environment provided by increased eyeheight does not improve the user's spatial orientation in the VE.
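The abstract describes two scaling rules: virtual translation amplified by a gain factor, and eyeheight scaled proportionally to that gain. The sketch below is a minimal illustration of how such scaling might be applied per frame; the function names, the ground-plane coordinate convention, and the exact proportional eyeheight rule are assumptions for illustration, not the authors' implementation.

```python
def apply_translational_gain(prev_virtual_pos, physical_delta, gain):
    """Map a physical head displacement (x, z in metres on the ground
    plane) into virtual space. With gain = 1, one physical step equals
    one virtual step; with gain = 50 (the paper's largest reported
    factor), one physical step covers fifty virtual steps.

    Hypothetical helper -- the paper does not publish its update rule.
    """
    vx, vz = prev_virtual_pos
    dx, dz = physical_delta
    return (vx + gain * dx, vz + gain * dz)

def scaled_eyeheight(base_eyeheight, gain):
    """Scale eyeheight proportionally to translational gain (an assumed
    proportional rule), yielding the map-like overview of the
    environment that the abstract describes."""
    return base_eyeheight * gain

# A half-metre physical step under a gain of 50:
pos = apply_translational_gain((0.0, 0.0), (0.5, 0.0), gain=50)
print(pos)  # (25.0, 0.0)

# Eyeheight of 1.75 m scaled by the same gain:
print(scaled_eyeheight(1.75, gain=50))  # 87.5
```

Note that amplifying translation this way also amplifies involuntary head sway, which is why the paper's method for suppressing small physical head motions becomes necessary at high gains.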
Database: OpenAIRE