Ear-AR
Author: | Sheng Shen, Yu-Lin Wei, Zhijian Yang, Romit Roy Choudhury |
Year of publication: | 2020 |
Subject: | Augmented reality, Wearable computer, Sensor fusion, Inertial measurement unit, Dead reckoning, Orientation (computer vision), Match moving, Software deployment, Human–computer interaction, Computer science |
Source: | MobiCom |
DOI: | 10.1145/3372224.3419213 |
Description: | This paper aims to use modern earphones as a platform for acoustic augmented reality (AAR). We intend to play 3D audio annotations in the user's ears as she moves and looks at AAR objects in the environment. While companies like Bose and Microsoft are beginning to release such capabilities, they are intended for outdoor environments. Our system aims to explore the challenges indoors, without requiring any infrastructure deployment. Our core idea is two-fold. (1) We jointly use the inertial sensors (IMUs) in earphones and smartphones to estimate a user's indoor location and gazing orientation. (2) We play 3D sounds in the earphones and exploit the human's responses to (re)calibrate errors in location and orientation. We believe this fusion of IMU and acoustics is novel, and could be an important step towards indoor AAR. Our system, Ear-AR, is tested on 7 volunteers invited to an AAR exhibition - like a museum - that we set up in our building's lobby and lab. Across 60 different test sessions, the volunteers browsed different subsets of 24 annotated objects as they walked around. Results show that Ear-AR plays the correct audio annotations with good accuracy. The user feedback is encouraging and points to further areas of research and applications. (An illustrative sketch of the playback idea described here appears after this record.) |
Database: | OpenAIRE |
External link: |
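
To make the description above more concrete, here is a minimal, hypothetical sketch of the playback step it outlines: given an estimated user location and gaze heading (as would come from smartphone and earphone IMU dead reckoning), compute the azimuth of an annotated object relative to the gaze and pan its audio annotation between the ears. This is not the paper's implementation; the object coordinates, the placeholder tone, and the simple constant-power panning are assumptions standing in for Ear-AR's actual 3D audio rendering.

```python
# Illustrative sketch only, NOT the Ear-AR implementation. Assumes the user's
# 2D position and gaze heading have already been estimated (e.g., via IMU
# dead reckoning), and renders a mono annotation as crude stereo by azimuth.
import numpy as np

def object_azimuth(user_pos, gaze_heading_rad, obj_pos):
    """Angle of the object relative to the user's gaze, in radians, in (-pi, pi]."""
    dx, dy = obj_pos[0] - user_pos[0], obj_pos[1] - user_pos[1]
    bearing = np.arctan2(dy, dx)  # world-frame direction from user to object
    return (bearing - gaze_heading_rad + np.pi) % (2 * np.pi) - np.pi

def pan_stereo(mono, azimuth_rad):
    """Very rough constant-power panning of a mono annotation by azimuth."""
    # Map azimuth in [-pi/2, pi/2] to a pan value in [0, 1] (0 = left, 1 = right).
    pan = np.clip(azimuth_rad, -np.pi / 2, np.pi / 2) / np.pi + 0.5
    left = np.cos(pan * np.pi / 2) * mono
    right = np.sin(pan * np.pi / 2) * mono
    return np.stack([left, right], axis=1)

# Hypothetical annotated exhibit: user at the origin facing along +x,
# object slightly to the right at (2, -1); 1-second placeholder tone.
user_pos, gaze = (0.0, 0.0), 0.0
annotation = np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)
az = object_azimuth(user_pos, gaze, (2.0, -1.0))
stereo = pan_stereo(annotation, az)
print(f"object azimuth: {np.degrees(az):.1f} deg, stereo shape: {stereo.shape}")
```

In the system the abstract describes, the location and heading fed into such a rendering step would themselves be (re)calibrated from the user's responses to the played 3D sounds, which is the second half of the paper's core idea.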