

Poster

Free360: Layered Gaussian Splatting for Unbounded 360-Degree View Synthesis from Extremely Sparse and Unposed Views

Chong Bao · Xiyu Zhang · Zehao Yu · Jiale Shi · Guofeng Zhang · Songyou Peng · Zhaopeng Cui


Abstract:

3D Gaussian Splatting (3DGS) has demonstrated remarkable success in high-quality 3D neural reconstruction and novel view rendering with dense input views and accurate poses. However, applying 3DGS to sparse, unposed views in unbounded 360-degree scenes remains a challenging problem. In this paper, we propose a novel neural rendering framework for unposed, extremely sparse-view 3D reconstruction in unbounded 360-degree scenes. To resolve the spatial ambiguity inherent in unbounded scenes with sparse input views, we propose a layered Gaussian-based representation that models the scene with distinct spatial layers. Using a dense stereo reconstruction model to recover coarse geometry, we introduce a reconstruction bootstrap optimization that corrects the noise and distortions in the coarse geometry. Furthermore, we propose an iterative fusion of reconstruction and generation, enabling the two processes to condition and enhance each other. Comprehensive experiments show that our approach outperforms existing state-of-the-art methods in both rendering quality and surface reconstruction accuracy.
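The abstract does not specify how the scene is partitioned into layers, but a minimal sketch of one plausible layering scheme is to bin primitives by their radial distance from the scene center (near, mid, and far/background shells). The function name and the distance thresholds below are hypothetical illustrations, not the paper's actual criterion.

```python
import math

# Hypothetical layer boundaries (in scene units): points closer than 4.0
# go to the near layer, points beyond 20.0 to the far/background layer.
DEFAULT_RADII = (4.0, 20.0)

def assign_layer(point, radii=DEFAULT_RADII):
    """Assign a 3D point to a spatial layer index by its distance from
    the scene origin -- an illustrative stand-in for a layered
    Gaussian-based scene partition."""
    d = math.sqrt(sum(c * c for c in point))
    for i, r in enumerate(radii):
        if d < r:
            return i
    return len(radii)  # beyond all thresholds: outermost layer

# Toy example: one point per intended layer.
layers = [assign_layer(p) for p in [(1.0, 0.0, 0.0),
                                    (10.0, 0.0, 0.0),
                                    (100.0, 0.0, 0.0)]]
```

In a layered representation like this, each layer's Gaussians can then be optimized with layer-appropriate priors (e.g., tighter depth regularization near the cameras, looser constraints on the distant background).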
