A recurrent CNN for online object detection on raw radar frames
Author: Decourt, Colin; VanRullen, Rufin; Salle, Didier; Oberlin, Thomas
Year of publication: 2022
Subject:
Source: IEEE Transactions on Intelligent Transportation Systems, 2024
Document type: Working Paper
DOI: 10.1109/TITS.2024.3404076
Description: Automotive radar sensors provide valuable information for advanced driver assistance systems (ADAS). Radars can reliably estimate the distance to an object and its relative velocity, regardless of weather and light conditions. However, radar sensors suffer from low resolution and large intra-class variations in the shape of objects. Exploiting temporal information (e.g., multiple frames) has been shown to better capture the dynamics of objects and, therefore, the variation in their shape. Most temporal radar object detectors use 3D convolutions to learn spatial and temporal information. However, these methods are often non-causal and unsuitable for real-time applications. This work presents RECORD, a new recurrent CNN architecture for online radar object detection. We propose an end-to-end trainable architecture mixing convolutions and ConvLSTMs to learn spatio-temporal dependencies between successive frames. Our model is causal and requires only the past information encoded in the memory of the ConvLSTMs to detect objects. Our experiments show the relevance of such a method for detecting objects in different radar representations (range-Doppler, range-angle) and show that it outperforms state-of-the-art models on the ROD2021 and CARRADA datasets while being less computationally expensive. Comment: 11 pages, 4 figures, 5 tables
Database: arXiv
External link:
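
The abstract describes a causal detector that interleaves convolutional layers with ConvLSTMs and carries only past information in the recurrent state. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea, not the authors' RECORD implementation: the `ConvLSTMCell`, `OnlineRadarDetector`, layer widths, and single recurrent stage are assumptions made for illustration only.

```python
# Hypothetical sketch: conv encoder -> ConvLSTM memory -> detection head.
# Layer sizes and module names are illustrative assumptions, not RECORD itself.
import torch
import torch.nn as nn


class ConvLSTMCell(nn.Module):
    """Single ConvLSTM cell: all four gates computed with one 2D convolution."""

    def __init__(self, in_ch, hid_ch, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, kernel_size, padding=pad)
        self.hid_ch = hid_ch

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

    def init_state(self, batch, height, width, device):
        zeros = torch.zeros(batch, self.hid_ch, height, width, device=device)
        return zeros, zeros.clone()


class OnlineRadarDetector(nn.Module):
    """Processes one radar frame at a time; the ConvLSTM state holds the past."""

    def __init__(self, in_ch=1, hid_ch=32, num_classes=3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, hid_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hid_ch, hid_ch, 3, padding=1), nn.ReLU(),
        )
        self.memory = ConvLSTMCell(hid_ch, hid_ch)
        self.head = nn.Conv2d(hid_ch, num_classes, 1)  # per-class heatmap logits

    def forward(self, frame, state=None):
        feat = self.encoder(frame)
        if state is None:
            b, _, height, width = feat.shape
            state = self.memory.init_state(b, height, width, feat.device)
        h, c = self.memory(feat, state)
        return self.head(h), (h, c)  # logits + recurrent state for the next frame


if __name__ == "__main__":
    model = OnlineRadarDetector()
    state = None
    # Frames arrive one at a time (online): only past frames influence the output.
    for _ in range(4):
        frame = torch.randn(1, 1, 64, 64)  # e.g. a range-Doppler map
        heatmaps, state = model(frame, state)
    print(heatmaps.shape)  # torch.Size([1, 3, 64, 64])
```

Because the recurrent state is updated frame by frame and never reads future frames, inference in this sketch is causal, which is the property the abstract highlights for real-time (online) use.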