Radar + RGB Fusion For Robust Object Detection In Autonomous Vehicle
Author: | Axel Vierling, Karsten Berns, Ritu Yadav |
---|---|
Year of publication: | 2020 |
Subject: | Computer science; computer vision; artificial intelligence; object detection; feature extraction; radar imaging; image pyramid; RGB color model; noise |
Source: | ICIP |
Description: | This paper presents two network architectures, RANet and BIRANet, which fuse radar signal data with RGB camera images to form a detection network that remains robust under variable lighting and adverse weather conditions such as rain, dust, and fog. First, radar information is fused into the feature extractor network. Second, radar points are used to generate guided anchors. Third, a method is proposed to improve region proposal network [1] targets. BIRANet achieves 72.3%/75.3% average AP/AR on the nuScenes [2] dataset, outperforming the base network, Faster R-CNN with Feature Pyramid Network (FFPN) [3]. RANet achieves 69%/71.9% average AP/AR on the same dataset, which is reasonably competitive. Both BIRANet and RANet are also shown to be robust to noise. |
Database: | OpenAIRE |
External link: |
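The description's first fusion step, injecting radar information alongside RGB input, is commonly implemented by projecting the 3-D radar returns into the image plane and rasterizing them into an extra channel. The sketch below is a minimal illustration of that general idea, not the authors' BIRANet/RANet implementation; the intrinsic matrix `K`, the toy image size, and the inverse-depth rasterization are all illustrative assumptions.

```python
import numpy as np

def radar_to_image_channel(radar_points, K, image_shape):
    """Project 3-D radar points (camera coordinates, metres) into the
    image plane via pinhole intrinsics K and rasterize them into a
    single-channel map, encoding inverse depth (nearer = brighter)."""
    h, w = image_shape
    channel = np.zeros((h, w), dtype=np.float32)
    for x, y, z in radar_points:
        if z <= 0:                      # behind the camera: skip
            continue
        u = int(K[0, 0] * x / z + K[0, 2])   # pixel column
        v = int(K[1, 1] * y / z + K[1, 2])   # pixel row
        if 0 <= u < w and 0 <= v < h:
            channel[v, u] = max(channel[v, u], 1.0 / z)
    return channel

# Toy intrinsics and two radar returns (hypothetical values).
K = np.array([[500.0,   0.0, 64.0],
              [  0.0, 500.0, 48.0],
              [  0.0,   0.0,  1.0]])
points = np.array([[0.0, 0.0, 10.0],
                   [1.0, 0.5, 20.0]])

ch = radar_to_image_channel(points, K, (96, 128))

# Stack the radar channel onto the RGB image so the feature extractor
# receives a fused 4-channel input.
rgb = np.zeros((96, 128, 3), dtype=np.float32)
fused = np.concatenate([rgb, ch[..., None]], axis=-1)
print(fused.shape)  # (96, 128, 4)
```

In practice the radar map is often smoothed or rendered at multiple scales before fusion; this sketch only shows the basic projection-and-stack pattern that early-fusion radar-camera detectors build on.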