An EyeTap video-based featureless projective motion estimation assisted by gyroscopic tracking for wearable computer mediated reality
Author: | Steve Mann, Chris Aimone, James Fung |
Year: | 2003 |
Subjects: | Wearable computer; Computer-mediated reality; EyeTap; Augmented reality; Motion estimation; Tracking system; Computer vision; Computer graphics; Graphics hardware; Mobile computing; Frame rate; Artificial intelligence |
Source: | Personal and Ubiquitous Computing 7:236–248 |
ISSN: | 1617-4909, 1617-4917 |
DOI: | 10.1007/s00779-003-0239-6 |
Abstract: | In this paper we present a computationally economical method of recovering the projective motion of head-mounted cameras or EyeTap devices for use in wearable computer-mediated reality. The tracking system combines featureless vision and inertial methods in a closed-loop system to achieve accurate, robust head tracking with inexpensive sensors. The combination of inertial and vision techniques provides both the high-accuracy visual registration needed to fit computer graphics onto real images and robustness to the large interframe camera motion caused by fast head rotations. Running on a 1.2 GHz Pentium III wearable computer with graphics acceleration hardware, the system registers live video images with less than 2 pixels of error (0.3 degrees) at 12 frames per second. Fast image registration is achieved by offloading computer vision computation onto the graphics hardware, which is readily available on many wearable computer systems. As an application of this tracking approach, we present a system that allows wearable computer users to share views of their current environments, stabilised to another viewer's head position. |
Database: | OpenAIRE |
External link: |
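The abstract describes seeding featureless visual registration with a gyroscopic prediction: for a camera undergoing (approximately) pure rotation, the rotation R integrated from the gyroscope induces an interframe image warp given by the planar homography H = K R K⁻¹, where K is the camera intrinsic matrix. A minimal sketch of that prediction step is below (NumPy; the intrinsic values and gyro reading are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def rotation_from_gyro(omega, dt):
    """Integrate a gyroscope rate reading omega (rad/s, body axes) over dt
    seconds into a rotation matrix, using Rodrigues' formula."""
    theta = np.linalg.norm(omega) * dt
    if theta < 1e-12:
        return np.eye(3)
    k = omega / np.linalg.norm(omega)      # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])     # cross-product (skew) matrix
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def homography_from_rotation(R, K):
    """For a purely rotating camera with intrinsics K, the interframe
    pixel warp is the homography H = K R K^-1."""
    return K @ R @ np.linalg.inv(K)

# Hypothetical intrinsics and a slow pan about the vertical axis,
# sampled at the paper's 12 frames per second.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
omega = np.array([0.0, 0.2, 0.0])
H = homography_from_rotation(rotation_from_gyro(omega, 1.0 / 12.0), K)
p = H @ np.array([320.0, 240.0, 1.0])      # warp the principal point
print(p[:2] / p[2])                        # predicted pixel position
```

In the closed-loop scheme the abstract outlines, a prediction like H would initialise the featureless (direct, image-gradient-based) refinement each frame, so the vision stage only has to correct a small residual motion rather than track large interframe rotations from scratch.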