Attention-based 3D Object Reconstruction from a Single Image
Author: | Eduardo Pooch, Felipe Tasoniero, Andrey De Aguiar Salvi, Nathan Gavenski, Rodrigo C. Barros |
---|---|
Language: | English |
Year of publication: | 2020 |
Subject: |
FOS: Computer and information sciences
Computer Science - Computer Vision and Pattern Recognition (cs.CV); computer vision; feature extraction; 3D reconstruction; iterative reconstruction; virtual reality; augmented reality; convolutional neural network; polygon mesh; encoder; artificial intelligence |
Source: | IJCNN |
Description: | Recently, learning-based approaches to 3D reconstruction from 2D images have gained popularity due to their modern applications, e.g., 3D printing, autonomous robots, self-driving cars, virtual reality, and augmented reality. The computer vision community has put great effort into developing functions that reconstruct the full 3D geometry of objects and scenes. However, to extract image features, these methods rely on convolutional neural networks, which are ineffective at capturing long-range dependencies. In this paper, we propose to substantially improve Occupancy Networks, a state-of-the-art method for 3D object reconstruction. To that end, we apply self-attention within the network's encoder so that it leverages complementary input features rather than only those based on local regions, helping the encoder extract global information. With our approach, we improve on the original work by 5.05% in mesh IoU and 0.83% in Normal Consistency, and by more than 10X in Chamfer-L1 distance. We also perform a qualitative study showing that our approach generates much more consistent meshes, confirming its increased generalization power over the current state-of-the-art. 8 pages, 4 figures, 3 tables |
Database: | OpenAIRE |
External link: |
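The abstract describes adding self-attention inside the encoder so that each feature position can attend to every other position, rather than only to a local convolutional neighborhood. Below is a minimal numpy sketch of generic scaled dot-product self-attention over flattened feature-map positions; it is an illustration of the mechanism, not the authors' exact layer, and the projection matrices `w_q`, `w_k`, `w_v` stand in for weights that would be learned in practice.

```python
import numpy as np

def self_attention(features, w_q, w_k, w_v):
    """Scaled dot-product self-attention over flattened CNN features.

    features: (N, C) array, where N = H*W spatial positions, C = channels.
    w_q, w_k, w_v: (C, D) projection matrices (learned in a real network).
    Returns the attended features (N, D) and the attention map (N, N).
    """
    q, k, v = features @ w_q, features @ w_k, features @ w_v
    # Pairwise affinities between every position and every other position,
    # scaled by sqrt(D) as in standard scaled dot-product attention.
    scores = q @ k.T / np.sqrt(k.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)       # softmax over all positions
    # Each output row is a global mixture of value vectors, so long-range
    # dependencies contribute directly to every position's representation.
    return attn @ v, attn
```

Because each row of the attention map spans all N positions, the output at any location aggregates information from the whole feature map in a single step, which is the "global information" the abstract says convolutions alone fail to capture.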