Motion Estimation Using Region-Level Segmentation and Extended Kalman Filter for Autonomous Driving

Authors: Yingping Huang, Zhiyang Guo, Baigan Zhao, Fuzhi Hu, Hongjian Wei, Rui Zhang
Language: English
Year of publication: 2021
Source: Remote Sensing, Vol. 13, Issue 9, Article 1828 (2021)
ISSN: 2072-4292
DOI: 10.3390/rs13091828
Abstract: Motion estimation is crucial for predicting where other traffic participants will be over a certain period of time and, accordingly, for planning the route of the ego-vehicle. This paper presents a novel approach to estimating the motion state by using region-level instance segmentation and an extended Kalman filter (EKF). Motion estimation involves three stages: object detection, tracking, and parameter estimation. We first use region-level segmentation to accurately locate the object region for the latter two stages. The region-level segmentation combines color, temporal (optical flow), and spatial (depth) information as the basis for segmentation by using super-pixels and a Conditional Random Field. Optical flow is then employed to track the feature points within the object area. In the parameter estimation stage, we develop a relative motion model of the ego-vehicle and the object and accordingly establish an EKF model for point tracking and parameter estimation. The EKF model integrates the ego-motion, optical flow, and disparity to generate optimized motion parameters. During tracking and parameter estimation, we apply an edge point constraint and a consistency constraint to eliminate outliers among the tracking points, so that the feature points used for tracking are ensured to lie within the object body and the parameter estimates are refined by the inner points. Experiments have been conducted on the KITTI dataset, and the results demonstrate that our method achieves excellent performance and outperforms other state-of-the-art methods in both object segmentation and parameter estimation.
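The abstract describes an EKF that fuses optical flow and stereo disparity to estimate the relative motion of tracked feature points. The sketch below is a minimal, hypothetical illustration of one such EKF predict/update cycle for a single tracked point; it assumes a constant-velocity relative motion model, a pinhole stereo camera with illustrative KITTI-like parameters (FX, FY, CX, CY, BASELINE), and a measurement vector of pixel position plus disparity. It is not the authors' implementation and omits their ego-motion compensation, edge point constraint, and consistency constraint.

import numpy as np

# Illustrative camera intrinsics and stereo baseline (assumed, not taken from the paper).
FX, FY, CX, CY, BASELINE = 718.0, 718.0, 607.0, 185.0, 0.54

def f_state(x, dt):
    """Constant-velocity relative motion model: state = [X, Y, Z, vx, vy, vz]."""
    F = np.eye(6)
    F[0, 3] = F[1, 4] = F[2, 5] = dt  # position integrates velocity
    return F @ x, F

def h_meas(x):
    """Project the relative 3-D point (X, Y, Z) to pixel (u, v) and stereo disparity d."""
    X, Y, Z = x[0], x[1], x[2]
    z = np.array([FX * X / Z + CX,
                  FY * Y / Z + CY,
                  FX * BASELINE / Z])
    # Jacobian of the measurement with respect to the state.
    H = np.zeros((3, 6))
    H[0, 0] = FX / Z
    H[0, 2] = -FX * X / Z**2
    H[1, 1] = FY / Z
    H[1, 2] = -FY * Y / Z**2
    H[2, 2] = -FX * BASELINE / Z**2
    return z, H

def ekf_step(x, P, z_obs, dt, Q, R):
    """One EKF predict/update cycle for a tracked feature point."""
    # Predict with the relative motion model.
    x_pred, F = f_state(x, dt)
    P_pred = F @ P @ F.T + Q
    # Update with the measured pixel position (from optical flow) and disparity.
    z_pred, H = h_meas(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z_obs - z_pred)
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new

In such a sketch, the state would be initialized from a triangulated stereo point with zero relative velocity, while Q and R would be diagonal covariances tuned to the motion-model and pixel/disparity noise, respectively.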
Database: OpenAIRE