Explorations for real-time point cloud rendering of natural scenes in virtual reality
Author: | Gauthier Lafruit, Rudy Ercek, Arnaud Schenkel, Daniele Bonatto, Segolene Rogge |
---|---|
Contributors: | Electronics and Informatics, Faculty of Engineering |
Language: | English |
Year of publication: | 2016 |
Subject: |
Parallel rendering
Computer science Software rendering Virtual reality Image-based modeling and rendering Real-time rendering 3D rendering Rendering (computer graphics) Viewing frustum Computer graphics (images) Computer vision Artificial intelligence |
Source: | IC3D |
Description: | This work is a proof of concept in which we explore the possibility of rendering natural scenes in a head-mounted display device without meshing. Real-time, stereoscopic full-HD rendering is obtained for a 14-million-point scene, using a low-end graphics card for virtual reality (Nvidia GeForce GTX 970) within an Oculus Rift DK2. High quality is achieved by using splatting, while real-time rendering is made possible by means of a good data structure and a complexity reduction of the scene with techniques such as Optimized Sub-Sampling, Level of Detail, and Frustum Culling. Altogether, these techniques lead to a good virtual reality immersion. Choices and limitations of the proposed techniques are discussed. |
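The frustum culling mentioned in the description can be illustrated with a minimal sketch: a point is kept only if it lies on the inner side of every frustum plane. This is a hypothetical illustration of the general technique, not the paper's implementation; the `inside_frustum` helper, the plane representation `((nx, ny, nz), d)`, and the toy cube-shaped "frustum" are all assumptions introduced here for clarity.

```python
# Per-point frustum culling sketch (hypothetical helper, not the authors' code).
# Each plane is ((nx, ny, nz), d); a point p is "inside" that plane's
# half-space when dot(n, p) + d >= 0. A point survives culling only if it
# is inside all planes.

def inside_frustum(point, planes):
    x, y, z = point
    for (nx, ny, nz), d in planes:
        if nx * x + ny * y + nz * z + d < 0:
            return False  # outside this half-space: cull the point
    return True

# Toy "frustum": a unit cube expressed as six half-spaces.
cube_planes = [
    ((1, 0, 0), 0), ((-1, 0, 0), 1),   # 0 <= x <= 1
    ((0, 1, 0), 0), ((0, -1, 0), 1),   # 0 <= y <= 1
    ((0, 0, 1), 0), ((0, 0, -1), 1),   # 0 <= z <= 1
]

points = [(0.5, 0.5, 0.5), (2.0, 0.5, 0.5)]
visible = [p for p in points if inside_frustum(p, cube_planes)]
# Only the first point lies inside the cube; the second is culled.
```

In a real renderer the six planes would be extracted from the camera's view-projection matrix each frame, and the test would typically run per octree node rather than per point, so that whole subtrees of the 14-million-point cloud can be rejected at once.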
Database: | OpenAIRE |
External link: |