Author:
Deng, Liangbin, Zhou, Tao, Yin, Bangyong, Guo, Zixuan, Sun, Qiaoling, Wen, He, Jiang, Geng, Li, Yi, Yang, Yuqiu, Wu, Junyao, Cai, Huan, Zhang, Miao, Hou, Nianxing, Deng, Linfeng |
Source:
IEEE Sensors Journal; September 2024, Vol. 24 Issue: 17 p28028-28035, 8p |
Abstract:
The fusion of millimeter-wave radar and camera sensors is crucial for tracking, positioning, and path planning in autonomous driving systems. Millimeter-wave radar offers relatively high range resolution but poor azimuth resolution; stereo cameras, by contrast, provide accurate azimuth measurements and high target detection rates but struggle to estimate target distance. To address these complementary limitations, this article proposes a fusion system that combines millimeter-wave radar and stereo cameras. The system applies spatiotemporal calibration to the data from both sensors, enabling effective target matching, and uses the matched longitudinal position obtained from the millimeter-wave radar to infer the target's position in the stereo camera frame. An improved evidence-decision-distance adaptive filtering method is then proposed, and the position data from the millimeter-wave radar and stereo camera are fused using both common mean filtering and optimal weighted filtering, effectively improving the lateral positioning accuracy of the target. Statistical experimental results show that, compared with the root-mean-square error (RMSE) of lateral position measurements from the millimeter-wave radar alone, mean filtering and optimal weighted filtering reduce the lateral positioning RMSE by 31.9% and 32.9%, respectively, while the proposed filtering method reduces it by 35.2%. The fusion method is therefore promising for accurate object positioning.
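The optimal weighted filtering mentioned in the abstract can be sketched as classical inverse-variance weighting, the minimum-variance way to fuse two independent, unbiased measurements of the same quantity. This is a minimal illustration under that assumption; the function name and the per-sensor variance inputs are illustrative and do not reproduce the paper's actual implementation or its evidence-decision-distance adaptation.

```python
def optimal_weighted_fusion(radar_x, camera_x, radar_var, camera_var):
    """Fuse two lateral-position measurements by inverse-variance weighting.

    The weights minimize the variance of the fused estimate:
        w_radar  = camera_var / (radar_var + camera_var)
        w_camera = radar_var  / (radar_var + camera_var)
    so the less noisy sensor receives the larger weight.
    (Illustrative sketch, not the paper's method.)
    """
    total = radar_var + camera_var
    w_radar = camera_var / total
    w_camera = radar_var / total
    return w_radar * radar_x + w_camera * camera_x

# Equal variances reduce to a plain mean of the two measurements;
# a noisier camera shifts the fused estimate toward the radar reading.
fused_equal = optimal_weighted_fusion(1.0, 3.0, 0.5, 0.5)   # plain mean: 2.0
fused_skewed = optimal_weighted_fusion(1.0, 3.0, 0.5, 1.5)  # closer to radar: 1.5
```

With equal variances the fused variance is half each sensor's variance, which is consistent with the abstract's observation that even simple mean filtering already reduces the lateral RMSE substantially.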
Database:
Supplemental Index
External link:
|