Poster
Neural Inverse Rendering from Propagating Light
Anagh Malik · Benjamin Attal · Andrew Xie · Matthew O’Toole · David B. Lindell
We present the first system for physically based, neural inverse rendering from multi-viewpoint videos of propagating light. Our approach relies on a time-resolved extension of neural radiance caching, a technique that accelerates inverse rendering by storing infinite-bounce radiance arriving at any point from any direction. The resulting model accurately accounts for direct and indirect light transport effects and, when applied to captured measurements from a flash lidar system, enables state-of-the-art 3D reconstruction in the presence of strong indirect light. Further, we demonstrate view synthesis of propagating light, automatic decomposition of captured measurements into direct and indirect components, and novel capabilities such as multi-view transient relighting of captured scenes.
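To make the core idea concrete, the sketch below shows one plausible way a time-resolved radiance cache could be structured: a small MLP queried with a 3D position, a direction, and a time delay, returning cached multi-bounce radiance. This is a minimal illustration under assumed inputs and architecture; the abstract does not specify the paper's actual network design, encoding, or training scheme.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeResolvedRadianceCache(nn.Module):
    """Illustrative time-resolved radiance cache (not the paper's exact model).

    Maps a query point x, incoming direction d, and time delay t (relative to
    the light pulse) to cached infinite-bounce radiance arriving at x from d.
    """

    def __init__(self, hidden_dim: int = 64):
        super().__init__()
        # Input: 3D position + 3D direction + 1D time delay.
        self.mlp = nn.Sequential(
            nn.Linear(3 + 3 + 1, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 3),  # RGB radiance
            nn.Softplus(),             # radiance is non-negative
        )

    def forward(self, x: torch.Tensor, d: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.mlp(torch.cat([x, d, t], dim=-1))


# Usage: batch-query cached radiance at sample points, directions, and time bins.
cache = TimeResolvedRadianceCache()
x = torch.rand(1024, 3)                                # query positions
d = F.normalize(torch.randn(1024, 3), dim=-1)          # unit directions
t = torch.rand(1024, 1)                                # time delays
radiance = cache(x, d, t)                              # shape (1024, 3)
```

During inverse rendering, such a cache would let the renderer look up indirect, multi-bounce contributions in a single query rather than recursively tracing paths, which is what makes radiance caching an accelerator; the time input extends the idea to light that is still propagating.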