Robust lifelong visual navigation and mapping

Author: Pascoe, G
Contributors: Newman, P, Maddern, W
Language: English
Year of publication: 2018
Subject:
Description: The ability to precisely determine one's location within the world (localisation) is a key requirement for any robot wishing to navigate through the world. For long-term operation, such a localisation system must be robust to changes in the environment, both short term (e.g. traffic, weather) and long term (e.g. seasons). This thesis presents two methods for performing such localisation using cameras: small, cheap, lightweight sensors that are universally available. Whilst many image-based localisation systems have been proposed in the past, they generally rely either on feature matching, which fails under many degradations such as motion blur, or on photometric consistency, which fails under changing illumination. The methods we propose here directly align images with a dense prior map. The first method uses maps synthesised from a combination of LIDAR scanners to generate geometry and cameras to generate appearance, whilst the second uses vision for both mapping and localisation. Both make use of an information-theoretic metric, the Normalised Information Distance (NID), for image alignment, relaxing the appearance-constancy assumption inherent in photometric methods (a minimal sketch of the NID computation follows this record). Our methods require significant computational resources, but through the use of commodity GPUs we are able to run them at a rate of 8–10 Hz. Our GPU implementations make use of low-level OpenGL, enabling compatibility across almost any GPU hardware. We also present a method for calibrating multi-sensor systems, enabling the joint use of cameras and LIDAR for mapping. Through experiments on both synthetic data and real-world data from over 100 km of driving outdoors, we demonstrate the robustness of our localisation system to large variations in appearance. Comparisons with state-of-the-art feature-based and direct methods show that ours is significantly more robust whilst maintaining similar precision.
Database: OpenAIRE
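
The abstract does not give the NID formula, but for two images A and B it is commonly defined as NID(A, B) = (H(A, B) − I(A; B)) / H(A, B), where H(A, B) is the joint entropy of corresponding pixel intensities and I(A; B) their mutual information, giving a value in [0, 1]. Below is a minimal NumPy sketch of this computation from a joint intensity histogram; the function name, the 32-bin discretisation, and the 8-bit intensity range are illustrative assumptions, not details taken from the thesis (which evaluates the metric on the GPU over a full image warp).

```python
import numpy as np

def nid(img_a, img_b, bins=32):
    """Normalised Information Distance between two same-sized greyscale
    images, estimated from a joint intensity histogram.
    NID = (H(A,B) - I(A;B)) / H(A,B): 0 for identical information
    content, 1 for statistically independent images."""
    # Joint histogram over corresponding pixels (assumes 8-bit range).
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(),
                                 bins=bins, range=[[0, 256], [0, 256]])
    p_ab = joint / joint.sum()        # joint distribution P(A, B)
    p_a = p_ab.sum(axis=1)            # marginal P(A)
    p_b = p_ab.sum(axis=0)            # marginal P(B)

    def entropy(p):
        p = p[p > 0]                  # drop zero bins to avoid log(0)
        return -np.sum(p * np.log2(p))

    h_ab = entropy(p_ab)                         # joint entropy H(A, B)
    mi = entropy(p_a) + entropy(p_b) - h_ab      # mutual information I(A; B)
    return (h_ab - mi) / h_ab if h_ab > 0 else 0.0
```

Because NID depends only on the statistical relationship between intensities rather than on their raw values, a map rendered under one illumination and a live image captured under another can still score well, which is the relaxation of appearance constancy the abstract refers to.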
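
To illustrate how such a metric plugs into direct alignment, the toy below searches integer 2-D translations for the shift minimising NID, reusing the nid function above. The thesis instead optimises a full SE(3) camera pose against a dense 3-D prior map on commodity GPUs; the brute-force search, the translation-only warp, and the align_by_nid name here are all hypothetical stand-ins for that pipeline.

```python
def align_by_nid(live, ref, max_shift=8):
    """Brute-force search over integer (dy, dx) shifts, keeping the
    shift whose overlapping regions minimise NID."""
    best_cost, best_shift = float("inf"), (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Crop both images to their mutual overlap under this shift.
            a = live[max(dy, 0):live.shape[0] + min(dy, 0),
                     max(dx, 0):live.shape[1] + min(dx, 0)]
            b = ref[max(-dy, 0):ref.shape[0] + min(-dy, 0),
                    max(-dx, 0):ref.shape[1] + min(-dx, 0)]
            cost = nid(a, b)
            if cost < best_cost:
                best_cost, best_shift = cost, (dy, dx)
    return best_shift, best_cost

# Toy usage: recover a known shift of a random "map" image.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64)).astype(float)
live = np.roll(ref, shift=(3, -2), axis=(0, 1))
print(align_by_nid(live, ref))   # -> ((3, -2), ~0.0)
```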