Monocular Depth Prediction through Continuous 3D Loss
Author: | Zhong Cao, Pingping Lu, Maani Ghaffari, Ryan M. Eustice, Yuanxin Zhong, Minghan Zhu, Huei Peng |
---|---|
Publication year: | 2020 |
Subject: |
FOS: Computer and information sciences; Computer Science - Computer Vision and Pattern Recognition (cs.CV); monocular depth prediction; ground truth; stereo cameras; point cloud; LIDAR; computer vision; artificial intelligence |
Source: | IROS |
DOI: | 10.48550/arxiv.2003.09763 |
Description: | This paper reports a new continuous 3D loss function for learning depth from monocular images. The dense depth prediction from a monocular image is supervised using sparse LIDAR points, which enables us to leverage available open-source datasets with camera-LIDAR sensor suites during training. Currently, accurate and affordable range sensors are not readily available: stereo cameras measure depth inaccurately, while LIDARs measure it sparsely and at high cost. In contrast to the current point-to-point loss evaluation approach, the proposed 3D loss treats point clouds as continuous objects; it therefore compensates for the lack of dense ground-truth depth caused by the sparsity of LIDAR measurements. We applied the proposed loss to three state-of-the-art monocular depth prediction approaches: DORN, BTS, and Monodepth2. Experimental evaluation shows that the proposed loss improves depth prediction accuracy and produces point clouds with more consistent 3D geometric structures than all tested baselines, implying the benefit of the proposed loss for general depth prediction networks. A video demo of this work is available at https://youtu.be/5HL8BjSAY4Y. Comment: 8 pages, 4 figures. Accepted by IROS 2020 |
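The description summarizes the idea but the record contains no formulation. As a rough illustration only, not the paper's exact loss, the following minimal sketch shows one common way to treat point clouds as continuous objects: represent each cloud as a mixture of Gaussian kernels and reward overlap between the cloud back-projected from the predicted depth map and the sparse LIDAR returns, so no point-to-point correspondences are needed. All function names and the kernel width `sigma` are hypothetical.

```python
import torch


def backproject(depth, K):
    """Back-project an (H, W) depth map into an (H*W, 3) point cloud
    using the 3x3 camera intrinsic matrix K (pinhole model)."""
    H, W = depth.shape
    v, u = torch.meshgrid(torch.arange(H, dtype=torch.float32),
                          torch.arange(W, dtype=torch.float32),
                          indexing="ij")
    pixels = torch.stack([u, v, torch.ones_like(u)], dim=-1).reshape(-1, 3)
    rays = pixels @ torch.inverse(K).T   # per-pixel viewing rays, (H*W, 3)
    return rays * depth.reshape(-1, 1)   # scale each ray by its depth


def continuous_3d_loss(pred_points, lidar_points, sigma=0.5):
    """Kernel-based alignment of two point clouds. Each cloud is treated
    as a continuous function (a mixture of Gaussian kernels centred on
    its points); the loss is the negative inner product of the two
    functions, so it decreases as the predicted surface moves onto the
    sparse LIDAR returns. sigma is an assumed kernel length scale."""
    d2 = torch.cdist(pred_points, lidar_points).pow(2)  # (N, M) squared distances
    k = torch.exp(-d2 / (2.0 * sigma ** 2))             # Gaussian kernel matrix
    return -k.mean()


# Usage sketch: `model`, `image`, `K`, and `lidar_points` are placeholders.
# depth = model(image)
# loss = continuous_3d_loss(backproject(depth, K), lidar_points)
# loss.backward()
```

Because the kernel is smooth in the point coordinates, gradients flow from every LIDAR return to every predicted point, which is what lets a dense prediction be supervised by a sparse cloud.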
Database: | OpenAIRE |
External link: |