Visual-model-based, real-time 3D pose tracking for autonomous navigation: methodology and experiments
Author: | Beno Benhabib, Hans de Ruiter |
---|---|
Year of publication: | 2008 |
Source: | Autonomous Robots. 25:267-286 |
ISSN: | 1573-7527, 0929-5593 |
DOI: | 10.1007/s10514-008-9094-7 |
Description: | This paper presents a novel 3D-model-based computer-vision method for tracking the full six-degree-of-freedom (dof) pose (position and orientation) of a rigid body in real time. The methodology targets autonomous-navigation tasks, such as interception of, or rendezvous with, mobile targets. Tracking an object's complete six-dof pose makes the proposed algorithm useful even when targets are not restricted to planar motion (e.g., flying or rough-terrain navigation). Tracking is achieved via a combination of textured model projection and optical flow. The main contribution of our work is the novel combination of optical flow with the z-buffer depth information produced during model projection, which allows six-dof tracking with a single camera. A localized illumination-normalization filter has also been developed to improve robustness to shading. Real-time operation is achieved using GPU-based filters and a new data-reduction algorithm, developed within the framework of our project, based on colour-gradient redundancy. Colour-gradient redundancy is an important property of colour images, namely, that the gradients of all colour channels are generally aligned; exploiting this property provides a threefold increase in speed. A processing rate of approximately 80 to 100 fps was obtained with both synthetic and real target-motion sequences, and sub-pixel accuracies were obtained in tests performed under different lighting conditions. (An illustrative sketch of the flow-plus-depth pose computation follows this record.) |
Database: | OpenAIRE |
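The abstract's central claim is that per-pixel optical flow, combined with depth read from the z-buffer of the projected textured model, constrains all six degrees of freedom from a single camera. Below is a minimal sketch of that general idea using the classical image Jacobian (interaction matrix) relating image-plane flow to a rigid-body twist; it is not the authors' exact formulation, and the function name `twist_from_flow`, the pinhole focal length `f`, and the plain least-squares solve are illustrative assumptions.

```python
import numpy as np

def twist_from_flow(points, flows, depths, f):
    """Estimate a 6-dof twist (vx, vy, vz, wx, wy, wz) from optical flow.

    points : (N, 2) pixel coordinates (x, y), relative to the principal point
    flows  : (N, 2) measured optical-flow vectors (u_dot, v_dot)
    depths : (N,)   per-pixel depths Z, here assumed to come from the
                    z-buffer of the projected textured model
    f      : focal length in pixels (hypothetical pinhole camera model)
    """
    A, b = [], []
    for (x, y), (u_dot, v_dot), Z in zip(points, flows, depths):
        # Classical interaction-matrix rows: image-plane velocity as a
        # linear function of the rigid-body twist, given depth Z.
        A.append([-f / Z, 0.0, x / Z, x * y / f, -(f + x * x / f), y])
        A.append([0.0, -f / Z, y / Z, f + y * y / f, -x * y / f, -x])
        b.extend([u_dot, v_dot])
    # With depth known at every tracked pixel, the 2N x 6 system is
    # generally well conditioned for N >= 3 non-degenerate points, so a
    # single camera suffices to recover all six degrees of freedom.
    twist, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return twist
```

The sketch illustrates why the z-buffer matters: monocular flow alone determines translation only up to scale and cannot separate motion along the optical axis from apparent scale change, whereas supplying Z from the rendered model removes that ambiguity.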