Adversarial Attacks on Camera-LiDAR Models for 3D Car Detection

Author: Abdelfattah, Mazen; Yuan, Kaiwen; Wang, Z. Jane; Ward, Rabab
Publication Year: 2021
Subject:
Document Type: Working Paper
Description: Most autonomous vehicles (AVs) rely on LiDAR and RGB camera sensors for perception. Trained on such point cloud and image data, perception models based on deep neural networks (DNNs) have achieved state-of-the-art performance in 3D detection. The vulnerability of DNNs to adversarial attacks has been investigated extensively in the RGB image domain and more recently in the point cloud domain, but rarely in both domains simultaneously. Multi-modal perception systems used in AVs fall into two broad types: cascaded models, which process each modality independently, and fusion models, which learn from the different modalities jointly. We propose a universal, physically realizable adversarial attack for each type, and study and contrast their respective vulnerabilities. We place a single adversarial object with a specific shape and texture on top of a car with the objective of making that car evade detection. Evaluated on the popular KITTI benchmark, our adversarial object caused the host vehicle to escape detection by each model type more than 50% of the time. The dense RGB input contributed more to the success of the adversarial attacks on both cascaded and fusion models.
Comment: arXiv admin note: text overlap with arXiv:2101.10747. Updates in v2: expanded conclusion and future work, reduced the size of Figure 5, and a small correction in Table 3
Database: arXiv