Knowledge distillation of multi-scale dense prediction transformer for self-supervised depth estimation

Author: Jimin Song, Sang Jun Lee
Language: English
Year of publication: 2023
Subject:
Source: Scientific Reports, Vol 13, Iss 1, Pp 1-13 (2023)
Document type: article
ISSN: 2045-2322
DOI: 10.1038/s41598-023-46178-w
Description: Abstract Depth estimation is an inverse projection problem that estimates pixel-level distances from a single image. Although supervised methods have shown promising results, they have the intrinsic limitation of requiring ground-truth depth from an external sensor. Self-supervised depth estimation, on the other hand, relieves the burden of collecting calibrated training data, but a large performance gap remains between supervised and self-supervised methods. The objective of this study is to reduce the performance gap between the supervised and self-supervised approaches. The loss function of previous self-supervised methods is mainly based on a photometric error, which is computed indirectly from images synthesized using depth and pose estimates. In this paper, we argue that a direct depth cue is more effective for training a depth estimation network. To obtain the direct depth cue, we employed a knowledge distillation technique, which is a teacher-student learning framework. The teacher network was trained in a self-supervised manner based on a photometric error, and its predictions were utilized to train a student network. We constructed a multi-scale dense prediction transformer with Monte Carlo dropout, and a multi-scale distillation loss was proposed to train the student network based on the ensemble of stochastic estimates. Experiments were conducted on the KITTI and Make3D datasets, and our proposed method achieved state-of-the-art accuracy in self-supervised depth estimation. Our code is publicly available at https://github.com/ji-min-song/KD-of-MS-DPT .
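The distillation scheme described in the abstract can be sketched in a few lines: the teacher produces several stochastic depth maps (Monte Carlo dropout kept active at inference), their mean serves as the pseudo-label, and the student is supervised at multiple output scales against that pseudo-label. The following is a minimal NumPy sketch of this idea, not the authors' implementation; all function names are hypothetical, and nearest-neighbor resizing stands in for whatever interpolation the paper actually uses.

```python
import numpy as np

def mc_ensemble_depth(stochastic_preds):
    # Average T stochastic depth maps from the teacher (MC dropout
    # active at inference) into a single pseudo-label depth map.
    return np.mean(np.stack(stochastic_preds, axis=0), axis=0)

def resize_nearest(depth, shape):
    # Nearest-neighbor resize of a 2-D depth map to `shape`
    # (a simple stand-in for bilinear interpolation).
    h, w = depth.shape
    out_h, out_w = shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return depth[np.ix_(rows, cols)]

def multi_scale_distillation_loss(student_preds, pseudo_label):
    # Sum of per-scale L1 losses between each student output and the
    # pseudo-label resized to that scale.
    total = 0.0
    for pred in student_preds:
        target = resize_nearest(pseudo_label, pred.shape)
        total += float(np.abs(pred - target).mean())
    return total
```

For example, averaging two teacher samples of constant depth 2.0 and 4.0 yields a pseudo-label of 3.0, and a student that predicts 3.0 at every scale incurs zero distillation loss.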
Database: Directory of Open Access Journals