PanoSSC: Exploring Monocular Panoptic 3D Scene Reconstruction for Autonomous Driving

Authors: Shi, Yining, Li, Jiusi, Jiang, Kun, Wang, Ke, Wang, Yunlong, Yang, Mengmeng, Yang, Diange
Publication Year: 2024
Subject:
Document Type: Working Paper
DOI: 10.1109/3DV62453.2024.00104
Description: Vision-centric occupancy networks, which represent the surrounding environment as uniform voxels with semantic labels, have become a new trend in camera-only autonomous driving perception, since they can detect obstacles for safe driving regardless of their shape and occlusion. Modern occupancy networks mainly focus on reconstructing visible voxels on object surfaces with voxel-wise semantic prediction. They often suffer from inconsistent predictions within a single object and mixed predictions across adjacent objects. These confusions may harm the safety of downstream planning modules. To this end, we investigate panoptic segmentation in 3D voxel scenes and propose an instance-aware occupancy network, PanoSSC. We predict foreground objects and the background separately and merge both in post-processing. For foreground instance grouping, we propose a novel 3D instance mask decoder that can efficiently extract individual objects. We unify geometric reconstruction, 3D semantic segmentation, and 3D instance segmentation into the PanoSSC framework and propose new metrics for evaluating panoptic voxels. Extensive experiments show that our method achieves competitive results on the SemanticKITTI semantic scene completion benchmark.
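The description outlines a post-processing step in which separately predicted foreground instances and background semantics are fused into one panoptic voxel grid. The sketch below is a minimal, hypothetical illustration of such a merge, not the authors' implementation; the array shapes, the confidence threshold, and the rule of resolving overlaps by instance score are assumptions made here for clarity.

```python
import numpy as np

def merge_panoptic_voxels(background_sem, instance_masks, instance_classes,
                          instance_scores, score_thresh=0.5):
    """Fuse background semantics and foreground instances into panoptic voxels.

    Assumed inputs (illustrative, not from the paper):
      background_sem   : (X, Y, Z) int array of semantic class ids for "stuff".
      instance_masks   : list of (X, Y, Z) bool arrays, one per predicted instance.
      instance_classes : list of int "thing" class ids, one per instance.
      instance_scores  : list of float confidences, one per instance.
    Returns (semantic, instance_id) voxel grids; instance_id 0 means "no instance".
    """
    semantic = background_sem.copy()
    instance_ids = np.zeros_like(background_sem, dtype=np.int32)
    best_score = np.zeros(background_sem.shape, dtype=np.float32)

    next_id = 1
    for mask, cls, score in zip(instance_masks, instance_classes, instance_scores):
        if score < score_thresh:
            continue  # drop low-confidence instances
        # Claim only voxels where this instance beats any previous claimant,
        # so overlapping instances are resolved in favor of the higher score.
        claim = mask & (score > best_score)
        semantic[claim] = cls
        instance_ids[claim] = next_id
        best_score[claim] = score
        next_id += 1

    return semantic, instance_ids
```

Resolving overlapping voxel claims by instance confidence is one common convention in panoptic fusion; the paper's actual merging rules may differ.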
Comment: 3dv2024
Database: arXiv