DynaPix SLAM: A Pixel-Based Dynamic Visual SLAM Approach
Author: Xu, Chenghao; Bonetto, Elia; Ahmad, Aamir
Year of publication: 2023
Subject:
Document type: Working Paper
Description: Visual Simultaneous Localization and Mapping (V-SLAM) methods achieve remarkable performance in static environments, but face challenges in dynamic scenes where moving objects severely affect their core modules. To avoid this, dynamic V-SLAM approaches often leverage semantic information, geometric constraints, or optical flow. However, these methods are limited by imprecise estimations and their reliance on the accuracy of deep-learning models. Moreover, predefined thresholds for static/dynamic classification, the a priori selection of dynamic object classes, and the inability to recognize unknown or unexpected moving objects often degrade their performance. To address these limitations, we introduce DynaPix, a novel semantic-free V-SLAM system based on per-pixel motion probability estimation and an improved pose optimization process. The per-pixel motion probability is estimated using a static background differencing method on image data and optical flows computed on splatted frames. With DynaPix, we fully integrate these probabilities into map point selection and apply them through weighted bundle adjustment within the tracking and optimization modules of ORB-SLAM2. We thoroughly evaluate our method using the GRADE and TUM RGB-D datasets, showing significantly lower trajectory errors and longer tracking times in both static and dynamic sequences. The source code, datasets, and results are available at https://dynapix.is.tue.mpg.de/.
Comment: Chenghao Xu and Elia Bonetto contributed equally to this work as first authors. 19 pages, 4 tables, 6 figures. Includes supplementary material.
Database: arXiv
External link:
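
The abstract above describes estimating a per-pixel motion probability (from static background differencing and optical flow) and using it to weight bundle adjustment. The following is a minimal, hypothetical Python sketch of that weighting idea only; it is not the authors' implementation, and all function names, the fusion rule, and the tuning constants (alpha, diff_scale, flow_scale) are assumptions made purely for illustration.

```python
# Illustrative sketch (not DynaPix code): fuse a background-difference score
# with a residual optical-flow score into a per-pixel motion probability,
# then down-weight reprojection residuals of likely-dynamic features,
# loosely mirroring the weighted bundle adjustment idea in the abstract.
import numpy as np


def motion_probability(frame, background, flow_mag,
                       alpha=0.5, diff_scale=25.0, flow_scale=2.0):
    """Return a per-pixel motion probability in [0, 1].
    All scales and the mixing weight are hypothetical tuning parameters."""
    diff = np.abs(frame.astype(np.float32) - background.astype(np.float32))
    p_diff = 1.0 - np.exp(-diff / diff_scale)      # larger difference -> more likely dynamic
    p_flow = 1.0 - np.exp(-flow_mag / flow_scale)  # larger residual flow -> more likely dynamic
    return alpha * p_diff + (1.0 - alpha) * p_flow


def weighted_reprojection_cost(residuals, motion_prob):
    """Weight each feature's squared reprojection residual by
    (1 - motion probability), a simple stand-in for a weighted BA term."""
    weights = 1.0 - motion_prob
    return np.sum(weights * np.sum(residuals ** 2, axis=-1))


# Toy usage with random data standing in for real frames, flows, and features.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (480, 640)).astype(np.uint8)
background = rng.integers(0, 256, (480, 640)).astype(np.uint8)
flow_mag = rng.random((480, 640)) * 3.0            # residual flow magnitude per pixel
prob = motion_probability(frame, background, flow_mag)
rows, cols = rng.integers(0, 480, 100), rng.integers(0, 640, 100)
feat_prob = prob[rows, cols]                       # probability at 100 feature locations
residuals = rng.normal(size=(100, 2))              # 2D reprojection residuals
print(weighted_reprojection_cost(residuals, feat_prob))
```

In this sketch, features that fall on pixels with high motion probability contribute little to the cost, which is the intuition behind using the probabilities inside tracking and optimization rather than hard static/dynamic thresholds.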