Poster
DeSplat: Decomposed Gaussian Splatting for Distractor-Free Rendering
Yihao Wang · Marcus Klasson · Matias Turkulainen · Shuzhe Wang · Juho Kannala · Arno Solin
Gaussian splatting enables fast novel-view synthesis in static 3D environments. However, reconstructing real-world environments remains challenging, as distractors or occluders break the multi-view consistency assumption required for good 3D reconstruction. Most existing methods for unconstrained novel-view synthesis rely on external semantic information from foundation models, which introduces additional computational overhead as pre-processing steps or during optimization. In this work, we propose a novel method that directly separates occluders from static scene elements purely based on volume rendering of Gaussian primitives. We model occluders in training images as two-dimensional hallucinations embedded in the camera views and apply alpha-blending in a back-to-front manner, in which the 3D scene and the 2D occluders are explicitly modelled within the alpha-compositing stages. Our approach implicitly separates the scene into distinct static and distractor elements, achieving results comparable to prior approaches with only photometric supervision. We demonstrate the effectiveness of our approach on in-the-wild datasets.
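The back-to-front compositing of the 3D scene and a per-view 2D occluder layer can be sketched as below. This is a minimal single-layer illustration, not the paper's implementation: the function name, the per-view RGBA occluder map, and the single compositing stage are all assumptions for exposition.

```python
import numpy as np

def composite_view(scene_rgb, occ_rgb, occ_alpha):
    """Alpha-composite a per-view 2D occluder layer over the colour
    rendered from the 3D scene Gaussians (back-to-front: the scene
    sits behind the occluder layer).

    scene_rgb : (H, W, 3) colour from the static 3D scene
    occ_rgb   : (H, W, 3) colour of the 2D occluder layer (assumed form)
    occ_alpha : (H, W)    opacity of the occluder layer in [0, 1]
    """
    a = occ_alpha[..., None]          # broadcast alpha over RGB channels
    return a * occ_rgb + (1.0 - a) * scene_rgb

# Toy usage: a fully opaque occluder pixel replaces the scene pixel,
# a zero-alpha pixel leaves the scene untouched.
scene = np.array([[[0.2, 0.4, 0.6], [0.2, 0.4, 0.6]]])
occ   = np.array([[[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]])
alpha = np.array([[1.0, 0.0]])
out = composite_view(scene, occ, alpha)
```

With photometric supervision alone, gradients flow to whichever layer best explains each pixel, which is what drives the implicit scene/distractor separation described above.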