NeLT: Object-oriented Neural Light Transfer
Author: Chuankun Zheng, Yuchi Huo, Shaohua Mo, Zhihua Zhong, Zhizhen Wu, Wei Hua, Rui Wang, Hujun Bao
Year of publication: 2023
Source: ACM Transactions on Graphics
ISSN: 1557-7368, 0730-0301
DOI: 10.1145/3596491
Description: This paper presents object-oriented neural light transfer (NeLT), a novel neural representation of the dynamic light transport between an object and its environment. Our method disentangles the global illumination (GI) of a scene into the light transport of individual objects, each represented by a neural network, and then composes these contributions explicitly. It therefore enables flexible rendering with dynamic lighting, cameras, materials, and objects. Our rendering reproduces important global illumination effects, including diffuse illumination, glossy illumination, dynamic shadowing, and indirect illumination, complementing the capabilities of existing neural object representations. Experiments show that NeLT requires neither path tracing nor shading results as input, yet achieves rendering quality comparable to state-of-the-art rendering frameworks, including recent deep learning-based denoisers.
Database: OpenAIRE
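The description above centers on one architectural idea: per-object light transport approximated by small neural networks whose contributions are then composed explicitly into the scene's global illumination. The snippet below is a minimal conceptual sketch of that composition pattern, assuming a PyTorch-style setup; the module names, input features, network sizes, and the simple additive composition are illustrative assumptions, not the paper's actual NeLT architecture.

```python
# Conceptual sketch only: per-object neural light transport modules whose
# outputs are composed explicitly. All names and shapes are assumptions.
from typing import List

import torch
import torch.nn as nn


class ObjectLightTransport(nn.Module):
    """Hypothetical per-object network: maps per-pixel scene features
    (e.g. G-buffer data plus a lighting code) to that object's radiance
    contribution."""

    def __init__(self, feature_dim: int = 32, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feature_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 3),  # RGB radiance contribution
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.mlp(features)


class ComposedScene(nn.Module):
    """Explicitly composes per-object contributions. A plain sum is used
    here as a stand-in for the paper's composition scheme."""

    def __init__(self, objects: List[ObjectLightTransport]):
        super().__init__()
        self.objects = nn.ModuleList(objects)

    def forward(self, per_object_features: List[torch.Tensor]) -> torch.Tensor:
        contributions = [net(feat) for net, feat in zip(self.objects, per_object_features)]
        return torch.stack(contributions, dim=0).sum(dim=0)


if __name__ == "__main__":
    scene = ComposedScene([ObjectLightTransport() for _ in range(3)])
    feats = [torch.randn(1024, 32) for _ in range(3)]  # 1024 pixels, 3 objects
    radiance = scene(feats)
    print(radiance.shape)  # torch.Size([1024, 3])
```

Because each object's transport lives in its own module, objects can in principle be added, removed, or relit independently, which is the flexibility the abstract attributes to the object-oriented decomposition.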