Factorized Motion Fields for Fast Sparse Input Dynamic View Synthesis

Authors: Somraj, Nagabhushan, Choudhary, Kapil, Mupparaju, Sai Harsha, Soundararajan, Rajiv
Publication Year: 2024
Subject:
Document Type: Working Paper
DOI: 10.1145/3641519.3657498
Description: Designing a 3D representation of a dynamic scene for fast optimization and rendering is a challenging task. While recent explicit representations enable fast learning and rendering of dynamic radiance fields, they require a dense set of input viewpoints. In this work, we focus on learning a fast representation for dynamic radiance fields with sparse input viewpoints. However, the optimization with sparse input is under-constrained and necessitates the use of motion priors to constrain the learning. Existing fast dynamic scene models do not explicitly model the motion, making them difficult to constrain with motion priors. We design an explicit motion model as a factorized 4D representation that is fast and can exploit the spatio-temporal correlation of the motion field. We then introduce reliable flow priors, including a combination of sparse flow priors across cameras and dense flow priors within cameras, to regularize our motion model. Our model is fast, compact, and achieves very good performance on popular multi-view dynamic scene datasets with sparse input viewpoints. The source code for our model can be found on our project page: https://nagabhushansn95.github.io/publications/2024/RF-DeRF.html.
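The abstract's core idea, replacing a dense 4D (x, y, z, t) motion grid with a factorized representation, can be illustrated with a minimal sketch. The class name, plane layout (six 2D planes over all axis pairs, in the spirit of K-planes-style decompositions), Hadamard feature fusion, nearest-neighbor sampling, and linear flow decoder below are all illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

class FactorizedMotionField:
    """Hypothetical sketch: a 4D motion field factorized into 2D planes."""

    def __init__(self, res=16, t_res=8, feat_dim=4, seed=0):
        rng = np.random.default_rng(seed)
        # Six 2D feature planes cover every pair of the (x, y, z, t) axes,
        # reducing storage from O(R^4) for a dense grid to O(R^2).
        self.axes = [(0, 1), (0, 2), (1, 2), (0, 3), (1, 3), (2, 3)]
        sizes = [res, res, res, t_res]
        self.planes = [
            rng.normal(scale=0.1, size=(sizes[i], sizes[j], feat_dim))
            for i, j in self.axes
        ]
        # Linear decoder maps the fused features to a 3D scene-flow vector.
        self.W = rng.normal(scale=0.1, size=(feat_dim, 3))

    def query(self, xyzt):
        # xyzt: (N, 4) points with coordinates normalized to [0, 1).
        feat = np.ones((xyzt.shape[0], self.planes[0].shape[-1]))
        for (i, j), plane in zip(self.axes, self.planes):
            # Nearest-neighbor sampling keeps the sketch short; a real
            # implementation would interpolate.
            u = np.clip((xyzt[:, i] * plane.shape[0]).astype(int),
                        0, plane.shape[0] - 1)
            v = np.clip((xyzt[:, j] * plane.shape[1]).astype(int),
                        0, plane.shape[1] - 1)
            feat *= plane[u, v]  # Hadamard fusion of per-plane features
        return feat @ self.W     # (N, 3) predicted flow vectors

field = FactorizedMotionField()
points = np.random.default_rng(1).random((5, 4))
flow = field.query(points)  # (5, 3) motion vectors
```

Because each plane couples only two axes, gradients from a flow prior at one (x, y, z, t) sample update entire rows and columns of the shared planes, which is one way such a factorization can exploit spatio-temporal correlation in the motion field.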
Comment: Accepted at SIGGRAPH 2024
Database: arXiv