Poster
Splatter-360: Generalizable 360 Gaussian Splatting for Wide-baseline Panoramic Images
Zheng Chen · Chenming Wu · Zhelun Shen · Chen Zhao · Weicai Ye · Haocheng Feng · Errui Ding · Song-Hai Zhang
Wide-baseline panoramic images are commonly used in applications such as VR and simulation rendering to reduce network bandwidth and storage requirements. However, synthesizing novel views from these panoramic images in real time remains a significant challenge, especially due to the high resolution and inherent distortions of panoramic imagery. Although existing 3D Gaussian splatting (3DGS) methods can produce photo-realistic views under narrow baselines, they often overfit the training views when dealing with wide-baseline panoramic images due to the difficulty of learning precise geometry from sparse 360-degree views. This paper presents Splatter-360, a novel end-to-end generalizable 3DGS framework specifically designed to handle wide-baseline panoramic images. Unlike previous approaches, Splatter-360 performs multi-view matching directly in the spherical domain by constructing a spherical cost volume through a spherical sweep algorithm, enhancing the network's depth perception and geometry estimation. Additionally, we introduce a 3D-aware bi-projection encoder to mitigate the distortions inherent in panoramic images and integrate cross-view attention to improve feature interactions across multiple viewpoints. This enables robust 3D-aware feature representations and real-time rendering capabilities. Experimental results on the HM3D and Replica datasets demonstrate that Splatter-360 significantly outperforms state-of-the-art NeRF and 3DGS methods (e.g., PanoGRF, MVSplat, DepthSplat, and HiSplat) in both synthesis quality and generalization performance for wide-baseline panoramic images. Source code will be released.
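To make the spherical-sweep idea concrete, the following is a minimal NumPy sketch of how a spherical cost volume might be built from two equirectangular feature maps. It is an illustration under simplifying assumptions, not the paper's implementation: rotation between cameras is omitted, sampling is nearest-neighbor rather than differentiable bilinear interpolation, and the function names (`equirect_to_dirs`, `spherical_sweep_cost_volume`) and the dot-product matching cost are hypothetical choices for exposition.

```python
import numpy as np

def equirect_to_dirs(h, w):
    # Unit ray direction for every pixel of an H x W equirectangular panorama.
    v, u = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    lon = (u + 0.5) / w * 2 * np.pi - np.pi      # longitude in [-pi, pi)
    lat = np.pi / 2 - (v + 0.5) / h * np.pi      # latitude in (-pi/2, pi/2)
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    return np.stack([x, y, z], axis=-1)          # (H, W, 3)

def dirs_to_equirect(d, h, w):
    # Inverse mapping: 3D points/directions back to fractional pixel coordinates.
    x, y, z = d[..., 0], d[..., 1], d[..., 2]
    lon = np.arctan2(x, z)
    lat = np.arcsin(np.clip(y / np.linalg.norm(d, axis=-1), -1.0, 1.0))
    u = (lon + np.pi) / (2 * np.pi) * w - 0.5
    v = (np.pi / 2 - lat) / np.pi * h - 0.5
    return u, v

def spherical_sweep_cost_volume(ref_feat, src_feat, t_src, radii):
    # ref_feat, src_feat: (H, W, C) feature maps of two panoramas.
    # t_src: source-camera translation in the reference frame (rotation omitted).
    # For each hypothesized sphere radius, back-project reference pixels to 3D,
    # reproject them into the source panorama, sample features, and correlate.
    h, w, c = ref_feat.shape
    dirs = equirect_to_dirs(h, w)
    volume = np.empty((len(radii), h, w))
    for i, r in enumerate(radii):
        pts = dirs * r - t_src                        # 3D points in source frame
        u, v = dirs_to_equirect(pts, h, w)
        ui = np.clip(np.round(u).astype(int), 0, w - 1)  # nearest-neighbor sampling
        vi = np.clip(np.round(v).astype(int), 0, h - 1)
        warped = src_feat[vi, ui]                     # (H, W, C)
        volume[i] = (ref_feat * warped).sum(-1) / np.sqrt(c)  # dot-product cost
    return volume
```

A depth map could then be read off the volume with a softmax-weighted average over the radius hypotheses, after which the estimated geometry would anchor the predicted Gaussians; in the actual method this all happens on learned features and is trained end-to-end.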