Robust tracking and detection based on radar camera fusion filtering in urban autonomous driving.

Author: Baek, Seoha; Kim, Jongho; Yi, Kyongsu
Source: Intelligent Service Robotics; Nov 2024, Vol. 17, Issue 6, p1125-1141, 17p
Abstract: The primary goal of autonomous vehicles is vehicle safety, achieved through planning and control based on an understanding of the driving environment. To understand the surrounding environment, most autonomous driving systems rely on lidar and cameras for tracking nearby objects. However, both sensors are effective only within a short range of 50 m and lack robustness in adverse conditions, so they cannot guarantee stable object tracking and state estimation. To enable continuous tracking over long distances, this paper therefore proposes a robust tracking and detection system based on radar-camera fusion for urban autonomous driving. In this system, radar serves as the primary sensor to maintain uninterrupted tracking at longer range, and cameras compensate for the radar's relatively inaccurate measurements, thereby enhancing tracking accuracy. To achieve robust tracking of surrounding objects in urban environments, the track management process uses velocity values, which radar measures directly unlike the other sensors, to associate tracklets with measurements. It then selects valid tracklets among the unmatched ones by comparing each tracklet's initialization time with the current time. To enhance the accuracy of the tracking results, the state update process integrates radar and camera information using a decentralized Kalman filter. This filter structure not only improves accuracy but also provides robustness against sensor failure and real-time performance in the fusion process. The algorithm is implemented on autonomous vehicles equipped with radar and low-cost cameras and is validated on test tracks as well as in urban and highway environments. The experimental results confirm that the algorithm continuously tracks targets over long distances and achieves improved tracking results through fusion with the camera, performing the process efficiently and robustly. [ABSTRACT FROM AUTHOR]
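
As an illustration of the association step summarized in the abstract, the sketch below shows one plausible way to use the radar's directly measured (Doppler-derived) velocity alongside position distance when matching tracklets to measurements. It is a minimal example, not the authors' code: the gate thresholds, the [x, y, v] layout, and the use of Hungarian assignment are assumptions made here for clarity.

# Minimal sketch (not the paper's implementation) of velocity-gated
# tracklet-measurement association for a radar-first tracker.
import numpy as np
from scipy.optimize import linear_sum_assignment

POS_GATE = 4.0   # assumed position gate [m]
VEL_GATE = 2.0   # assumed velocity gate [m/s]

def associate(tracklets, measurements):
    """Return (matches, unmatched_tracklets, unmatched_measurements).

    tracklets:    (N, 3) array of predicted [x, y, v] per tracklet
    measurements: (M, 3) array of measured  [x, y, v] from radar
    """
    if len(tracklets) == 0 or len(measurements) == 0:
        return [], list(range(len(tracklets))), list(range(len(measurements)))

    # Pairwise position distance and velocity difference.
    pos_cost = np.linalg.norm(
        tracklets[:, None, :2] - measurements[None, :, :2], axis=-1)
    vel_cost = np.abs(tracklets[:, None, 2] - measurements[None, :, 2])

    # Reject pairs outside either gate; the velocity gate is the radar-specific cue.
    cost = pos_cost + vel_cost
    cost[(pos_cost > POS_GATE) | (vel_cost > VEL_GATE)] = 1e6

    rows, cols = linear_sum_assignment(cost)
    matches = [(r, c) for r, c in zip(rows, cols) if cost[r, c] < 1e6]
    matched_t = {r for r, _ in matches}
    matched_m = {c for _, c in matches}
    unmatched_t = [i for i in range(len(tracklets)) if i not in matched_t]
    unmatched_m = [j for j in range(len(measurements)) if j not in matched_m]
    return matches, unmatched_t, unmatched_m

Unmatched tracklets returned by such a routine would then be screened by track management, for example by comparing each tracklet's initialization time with the current time as the abstract describes.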
Database: Complementary Index
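
For the state-update stage, the following is a highly simplified sketch of decentralized Kalman-filter fusion. Each sensor (radar, camera) is assumed to run its own local filter over an [x, y, vx, vy] state, and a fusion step combines the two local estimates in information form; ignoring cross-correlation between the local estimates is a simplification made for this sketch, not a claim about the paper's filter.

# Simplified sketch (not the paper's implementation) of decentralized fusion:
# each sensor runs a local Kalman update, and the fused estimate is formed
# by information-weighted combination of the local estimates.
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update for one local (per-sensor) filter."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

def fuse_estimates(x_r, P_r, x_c, P_c):
    """Information-form fusion of radar and camera local estimates.

    Cross-correlation between the local estimates is ignored here for brevity.
    """
    I_r, I_c = np.linalg.inv(P_r), np.linalg.inv(P_c)
    P_f = np.linalg.inv(I_r + I_c)
    x_f = P_f @ (I_r @ x_r + I_c @ x_c)
    return x_f, P_f

if __name__ == "__main__":
    # Hypothetical numbers purely for illustration.
    x = np.array([0.0, 0.0, 10.0, 0.0])     # shared predicted state [x, y, vx, vy]
    P = np.diag([4.0, 4.0, 1.0, 1.0])
    H = np.eye(4)                            # both sensors observe the full state here
    x_r, P_r = kf_update(x, P, np.array([0.5, 0.1, 9.5, 0.0]), H,
                         np.diag([1.0, 1.0, 0.1, 0.1]))   # radar: good velocity
    x_c, P_c = kf_update(x, P, np.array([0.2, 0.0, 10.5, 0.1]), H,
                         np.diag([0.3, 0.3, 2.0, 2.0]))   # camera: good position
    x_f, P_f = fuse_estimates(x_r, P_r, x_c, P_c)
    print(x_f)

One practical appeal of such a decentralized structure, consistent with the abstract, is graceful degradation: if the camera estimate is unavailable, the radar-only local estimate can be used directly without reconfiguring the filter.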