VRsneaky: Stepping into an audible virtual world with gait-aware auditory feedback

Author: Dietz, Felix; Hoppe, Matthias; Karolus, Jakob; Wozniak, Paweł W.; Schmidt, Albrecht; Machulla, Tonja
Contributors: Sub Human-Centered Computing, Human-Centered Computing
Language: English
Year of publication: 2020
Subject:
Source: CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery (ACM)
Description: New VR experiences allow users to walk extensively through virtual space: larger tracking areas, treadmills, and redirected-walking solutions are now available. Yet state-of-the-art VR setups still neglect an important connection to the user's movement: auditory feedback for locomotion. As shown in our paper, synchronized step sounds further involve the user in the experience and the virtual world, but they are often omitted. VRsneaky detects the user's gait with force sensing resistors (FSRs) and accelerometers attached to the shoe and plays synchronized, gait-aware step sounds accordingly. In an exciting bank-robbery scenario, the user attempts to rob a bank behind a guard's back. Tension rises as the user must be aware of every step in this atmospheric experience: each step is rendered with adaptive step sounds of varying loudness, reminding the user to pay attention to every movement.
Database: OpenAIRE
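
The record above describes the mechanism only at a high level. As a rough illustration, the following minimal Python sketch shows one way shoe-mounted FSR readings could be thresholded (with hysteresis) to detect heel strikes and trigger a step sound whose loudness scales with impact force. The thresholds, the normalized sensor values, and the play_step_sound stub are hypothetical assumptions for illustration, not the authors' published implementation.

from typing import Iterable

# Hypothetical thresholds for a normalized FSR reading in [0.0, 1.0].
STEP_DOWN_THRESHOLD = 0.6   # pressure above this counts as a heel strike
STEP_UP_THRESHOLD = 0.2     # pressure below this means the foot has lifted


def play_step_sound(volume: float) -> None:
    """Placeholder for the audio backend; louder sounds for harder strikes."""
    print(f"step sound at volume {volume:.2f}")


def detect_steps(fsr_readings: Iterable[float]) -> None:
    """Trigger one gait-aware step sound per detected heel strike."""
    foot_down = False
    for reading in fsr_readings:
        if not foot_down and reading > STEP_DOWN_THRESHOLD:
            foot_down = True
            # Play the sound at the moment of the strike; the impact force
            # determines the noise level the in-game guard could notice.
            play_step_sound(volume=min(reading, 1.0))
        elif foot_down and reading < STEP_UP_THRESHOLD:
            foot_down = False  # foot lifted, ready for the next step


if __name__ == "__main__":
    # Simulated readings for one hard step followed by one soft step.
    detect_steps([0.0, 0.1, 0.9, 0.7, 0.1, 0.0, 0.65, 0.4, 0.1])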