Improving 3D Occupancy Prediction through Class-balancing Loss and Multi-scale Representation

Author: Chen, Huizhou, Wang, Jiangyi, Li, Yuxin, Zhao, Na, Cheng, Jun, Yang, Xulei
Year of Publication: 2024
Subject:
Document Type: Working Paper
Description: 3D environment recognition is essential for autonomous driving systems, as autonomous vehicles require a comprehensive understanding of surrounding scenes. Recently, 3D occupancy prediction has become the predominant way to formulate this real-world problem: it predicts the occupancy state and semantic label of every voxel in 3D space, which enhances perception capability. Bird's-Eye-View (BEV)-based perception has achieved state-of-the-art (SOTA) performance on this task. Nonetheless, this architecture fails to represent BEV features at multiple scales. In this paper, inspired by the success of UNet in semantic segmentation tasks, we introduce a novel UNet-like Multi-scale Occupancy Head module to alleviate this issue. Furthermore, we propose a class-balancing loss to compensate for rare classes in the dataset (an illustrative sketch of such a loss follows this record). Experimental results on the nuScenes 3D Occupancy Challenge dataset show the superiority of our proposed approach over baseline and SOTA methods.
Comment: 5 pages, 3 figures, accepted by IEEE CAI 2024
Database: arXiv
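
The abstract does not specify the exact form of the class-balancing loss, so the following is only a minimal sketch, assuming an inverse-frequency weighted cross-entropy over voxel-wise semantic labels. The names `balanced_ce_loss`, `inverse_frequency_weights`, and `class_frequencies` are illustrative, not taken from the paper.

```python
# Hedged sketch: class-balanced voxel-wise loss via inverse-frequency
# weighted cross-entropy. The paper's actual formulation may differ.
import torch
import torch.nn.functional as F


def inverse_frequency_weights(class_frequencies: torch.Tensor) -> torch.Tensor:
    """Per-class weights that up-weight rare classes.

    class_frequencies: 1D tensor of voxel counts per semantic class.
    """
    weights = 1.0 / torch.clamp(class_frequencies.float(), min=1.0)
    # Normalize so the weights sum to the number of classes.
    return weights * class_frequencies.numel() / weights.sum()


def balanced_ce_loss(logits: torch.Tensor,
                     targets: torch.Tensor,
                     class_frequencies: torch.Tensor) -> torch.Tensor:
    """Weighted cross-entropy over a dense voxel grid.

    logits:  (B, C, X, Y, Z) raw class scores per voxel.
    targets: (B, X, Y, Z) integer semantic labels per voxel.
    """
    weights = inverse_frequency_weights(class_frequencies).to(logits.device)
    return F.cross_entropy(logits, targets, weight=weights)
```

Usage would follow the standard pattern: compute `class_frequencies` once from the training-set label statistics, then call `balanced_ce_loss(logits, targets, class_frequencies)` in place of an unweighted cross-entropy so that rare classes contribute more to the gradient.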