3-D perception and modeling

Author: Andreas Birk, Kaustubh Pathak, Narunas Vaskevicius, Heiko Buelow, Sören Schwertfeger, Jann Poppinga
Year of publication: 2009
Subject:
Source: IEEE Robotics & Automation Magazine. 16:53-60
ISSN: 1070-9932
DOI: 10.1109/mra.2009.934822
Description: In the context of the 2008 Lunar Robotics Challenge (LRC) of the European Space Agency (ESA), the Jacobs Robotics team investigated three-dimensional (3-D) perception and modeling as an important basis of autonomy in unstructured domains. Concretely, the efficient modeling of terrain via a 3D laser range finder (LRF) is addressed. The underlying fast extraction of planar surface patches can be used to improve an operator's situational awareness or for path planning. 3D perception and modeling is an important basis for mobile robot operations in planetary exploration scenarios, as it supports good situation awareness for motion-level teleoperation as well as higher-level intelligent autonomous functions. It is hence desirable to obtain long-range 3D data with high resolution, a large field of view, and very fast update rates. 3D LRFs have high potential in this respect. In addition, 3D LRFs can operate under conditions where standard vision-based methods fail, e.g., under extreme lighting conditions. However, it is nontrivial to transmit the huge amount of data delivered by a 3D LRF to an operator station or to use this point cloud data as a basis for higher-level intelligent functions. Based on our participation in the LRC of the ESA, it is shown how the huge amount of 3D point cloud data from a 3D LRF can be reduced tremendously. Concretely, large sets of points are replaced by planar surface patches that are fitted to the data in an optimal way. The underlying computations are very efficient and hence suited for online computation on board the robot.
Database: OpenAIRE
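
The description mentions replacing large sets of points with planar patches fitted to the data in an optimal way. As a minimal, hedged sketch of that basic building block (not the authors' patch-extraction pipeline), the following Python snippet fits a least-squares plane to a set of 3D points via eigendecomposition of their covariance matrix; the function name `fit_plane` and the synthetic example data are illustrative only.

```python
# Sketch: least-squares plane fit to 3D points (generic technique, not the
# paper's specific algorithm). The plane normal is the eigenvector of the
# point covariance with the smallest eigenvalue; that eigenvalue measures the
# residual spread of the points off the plane.
import numpy as np

def fit_plane(points: np.ndarray):
    """Fit a plane n . x = d to an (N, 3) array of points.
    Returns (normal, d, rms_error)."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    normal = eigvecs[:, 0]                   # direction of least variance
    d = float(normal @ centroid)             # plane offset along the normal
    rms_error = float(np.sqrt(eigvals[0]))   # off-plane residual
    return normal, d, rms_error

# Example: a noisy, roughly horizontal patch is summarized by a single plane.
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(500, 3))
pts[:, 2] = 0.05 * rng.standard_normal(500)  # z ~ 0 plus sensor-like noise
n, d, err = fit_plane(pts)
print(n, d, err)  # normal close to (0, 0, +/-1), small residual
```

In a patch-based terrain model along these lines, each such plane (normal, offset, extent, and residual) stands in for hundreds or thousands of raw range points, which is what makes transmitting and reasoning over the data tractable.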