

Poster

Ref-GS: Modeling View-Dependent Appearance with Environment Gaussian

Tao Xie · Xi Chen · Zhen Xu · Yiman Xie · Yudong Jin · Yujun Shen · Sida Peng · Hujun Bao · Xiaowei Zhou


Abstract:

Reconstructing complex reflections in real-world scenes from 2D images is essential for achieving photorealistic novel view synthesis. Existing methods that utilize environment maps to model reflections from distant lighting often struggle with high-frequency reflection details and fail to account for near-field reflections. In this work, we introduce Ref-GS, a novel environment representation that employs a set of Gaussian primitives as an explicit 3D model for capturing reflections. Our approach models all reflections, regardless of their distance, with a unified set of Gaussian primitives, effectively representing high-frequency reflections from both near and distant light sources. To efficiently render these environment Gaussians, we develop a ray-tracing-based renderer that leverages the GPU's RT cores for fast rendering. This allows us to jointly optimize our model for high-quality reconstruction while maintaining real-time rendering speeds. Results on multiple real-world and synthetic datasets demonstrate that our method produces significantly more detailed reflections and achieves the best rendering quality for real-time novel view synthesis. The code will be released upon acceptance.
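To make the idea concrete, the following is a minimal sketch, under stated assumptions, of what querying "environment Gaussians" along a reflected ray could look like: the view direction is mirrored about the surface normal, each isotropic Gaussian's peak response along that ray is evaluated, and the responses are alpha-composited front to back. This is not the authors' implementation; the paper uses anisotropic 3D Gaussians rendered with a hardware ray-tracing (RT-core) renderer, and the names EnvGaussians and shade_reflection below are hypothetical.

    # Minimal sketch (assumptions, not the paper's code): reflection lookup into a
    # set of isotropic environment Gaussians via a single reflected ray.
    import numpy as np

    class EnvGaussians:
        """Toy container for environment Gaussian primitives."""
        def __init__(self, means, sigmas, opacities, colors):
            self.means = np.asarray(means, dtype=np.float64)          # (N, 3) centers
            self.sigmas = np.asarray(sigmas, dtype=np.float64)        # (N,) isotropic std devs
            self.opacities = np.asarray(opacities, dtype=np.float64)  # (N,) in [0, 1]
            self.colors = np.asarray(colors, dtype=np.float64)        # (N, 3) RGB

    def reflect(view_dir, normal):
        """Mirror the (unit) camera-to-point direction about the surface normal."""
        return view_dir - 2.0 * np.dot(view_dir, normal) * normal

    def shade_reflection(gaussians, origin, ref_dir):
        """Composite Gaussian responses along the ray origin + t * ref_dir (t > 0)."""
        d = ref_dir / np.linalg.norm(ref_dir)
        offsets = gaussians.means - origin                  # (N, 3)
        t_peak = offsets @ d                                # ray parameter of closest approach
        perp = offsets - t_peak[:, None] * d                # perpendicular offset to each center
        # Peak Gaussian response along the ray, scaled by the primitive's opacity.
        response = gaussians.opacities * np.exp(
            -0.5 * np.sum(perp**2, axis=1) / gaussians.sigmas**2)
        color, transmittance = np.zeros(3), 1.0
        for i in np.argsort(t_peak):                        # front-to-back along the ray
            if t_peak[i] <= 0.0:                            # skip Gaussians behind the origin
                continue
            alpha = min(response[i], 0.999)
            color += transmittance * alpha * gaussians.colors[i]
            transmittance *= 1.0 - alpha
        return color

    # Usage: reflection color at a shading point with a given normal and view direction.
    env = EnvGaussians(means=[[0.0, 2.0, 0.0], [3.0, 1.0, 0.0]],
                       sigmas=[0.5, 0.8], opacities=[0.9, 0.6],
                       colors=[[1.0, 0.2, 0.1], [0.1, 0.4, 1.0]])
    p = np.array([0.0, 0.0, 0.0])                           # surface point
    n = np.array([0.0, 1.0, 0.0])                           # surface normal
    v = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)           # camera-to-point direction
    print(shade_reflection(env, p, reflect(v, n)))

Because every reflection, near or far, is represented by the same set of primitives, the lookup above needs no separate environment map: near-field reflectors simply sit at finite distances along the reflected ray, which is what the unified representation in the abstract refers to.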
