Specular BSDF Approximation for Efficient Specular Scene Rendering
Author: | Bouchard, Guillaume; Iehl, Jean-Claude; Ostromoukhov, Victor; Péroche, Bernard; Albin, Stéphane; Guenegou, Romain; Uson, Carmen |
---|---|
Contributors: | Rendu Réaliste pour la Réalité Augmentée Mobile (R3AM), Laboratoire d'InfoRmatique en Image et Systèmes d'information (LIRIS), Institut National des Sciences Appliquées de Lyon (INSA Lyon), Université de Lyon-Institut National des Sciences Appliquées (INSA)-Université de Lyon-Institut National des Sciences Appliquées (INSA)-Centre National de la Recherche Scientifique (CNRS)-Université Claude Bernard Lyon 1 (UCBL), Université de Lyon-École Centrale de Lyon (ECL), Université de Lyon-Université Lumière - Lyon 2 (UL2)-Institut National des Sciences Appliquées de Lyon (INSA Lyon), Université de Lyon-Université Lumière - Lyon 2 (UL2), SI LIRIS, Équipe gestionnaire des publications |
Language: | English |
Year of publication: | 2012 |
Subject: | |
Source: | International Light Simulation Symposium 2012, Mar 2012, Nuremberg, Germany. pp. 217-232 |
Description: | International audience; We propose a simple and robust adaptive specular BSDF evaluation algorithm based on stochastic progressive photon mapping. This algorithm can handle scenes that are considered difficult for most current simulation approaches dealing with mixed diffuse and specular objects; by contrast, our approach can handle highly specular scenes such as car lamps and light guides. The proposed method is simple to implement, needs very little memory for its data structures, and the resulting image does not depend on parameters. The method can produce bias- and noise-free images. The contribution of this paper is twofold. First, we propose a simple and straightforward method for estimating the light transport of highly specular scenes. Second, we demonstrate an efficient approximation of the exact method through a real-time GPU-based implementation. As with progressive photon mapping, the algorithm alternates two passes, the eye-pass and the photon-pass. The eye-pass traces rays from the observer through the scene and stores a hit-point on the first surface hit; each hit-point is associated with a search radius and the BSDF of that surface. In the photon-pass, photons are traced from the light sources through the scene and their energy is splatted at each hit-point; the energy contributed to each hit-point depends on its associated BSDF and gathering radius. At the end of each eye-pass, the gathering radius of the hit-points is reduced. This ensures that the error of the light-density estimate associated with each hit-point diminishes and converges to the correct value. In our method, rendering starts with a near-diffuse glossy BSDF and, over successive passes, the glossiness is raised, converging to a highly glossy BSDF and, in the limit, to a specular BSDF. This method behaves exactly like progressive photon mapping with respect to convergence properties. The trade-off between initial bias and variance is controlled by the initial gathering radius: the variance that an unbiased method would produce on this kind of scene is replaced by an initial bias, which converges to zero as more photons are used in the approximation. Because the method relies on storing hit-points on the first intersected surface, the gathering step can be performed in screen space, which fits the GPU perfectly. If bias-free results are not needed, the algorithm becomes viewpoint-independent and therefore suitable for real-time visualization of the scene, with quality limited only by the number of photons the GPU can store. |
Database: | OpenAIRE |
External link: |
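The progressive loop described in the abstract — an eye-pass that stores hit-points, a photon-pass that splats energy, a shrinking gathering radius, and a BSDF whose glossiness rises toward specular — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the 1-D geometry, the `HitPoint` fields, the fixed per-pass radius factor `alpha`, and the doubling glossiness schedule are all simplifying assumptions.

```python
# Minimal sketch (assumed names, 1-D geometry) of a stochastic-progressive-
# photon-mapping-style loop with a progressively sharpening glossy BSDF.
import math
import random

class HitPoint:
    """Hit-point stored by the eye-pass on the first intersected surface."""
    def __init__(self, position, radius, glossiness):
        self.position = position      # first surface intersection (1-D stand-in)
        self.radius = radius          # current gathering radius
        self.glossiness = glossiness  # exponent of the glossy BSDF
        self.flux = 0.0               # accumulated photon energy
        self.photon_count = 0

def splat(hit, photon_pos, photon_energy):
    """Photon-pass step: deposit a photon's energy at a hit-point if it
    lands inside the gathering radius (BSDF weighting omitted for brevity)."""
    if abs(hit.position - photon_pos) <= hit.radius:
        hit.flux += photon_energy
        hit.photon_count += 1

def progressive_render(num_passes, initial_radius, alpha=0.7):
    # Eye-pass stand-in: store hit-points with a near-diffuse glossy BSDF
    # (low exponent) and a large initial gathering radius.
    hits = [HitPoint(position=random.random(),
                     radius=initial_radius,
                     glossiness=1.0) for _ in range(8)]
    for _ in range(num_passes):
        # Photon-pass: trace photons from the lights and splat their energy.
        for _ in range(100):
            photon_pos, photon_energy = random.random(), 1.0 / 100
            for h in hits:
                splat(h, photon_pos, photon_energy)
        for h in hits:
            # Shrink the gathering radius so the density estimate converges;
            # a fixed factor is a simplification of the SPPM schedule.
            h.radius *= math.sqrt(alpha)
            # Raise the glossiness so the BSDF approaches a specular one
            # in the limit (doubling is an arbitrary illustrative schedule).
            h.glossiness *= 2.0
    return hits
```

The key trade-off from the abstract appears directly: a larger `initial_radius` lowers variance at the cost of more initial bias, and the per-pass radius reduction drives that bias toward zero as photons accumulate.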