
Poster

AuraFusion360: Augmented Unseen Region Alignment for Reference-based 360° Unbounded Scene Inpainting

Chung-Ho Wu · Yang-Jung Chen · Ying-Huan Chen · Jie-Ying Lee · Bo-Hsu Ke · Chun-Wei Tuan Mu · Yichuan Huang · Chin-Yang Lin · Min-Hung Chen · Yen-Yu Lin · Yu-Lun Liu


Abstract:

Three-dimensional scene inpainting is crucial for applications from virtual reality to architectural visualization, yet existing methods struggle with view consistency and geometric accuracy in 360° unbounded scenes. We present AuraFusion360, a novel reference-based method leveraging 3D Gaussian Splatting for high-quality object removal and hole filling. Our approach introduces (1) depth-aware unseen mask generation for accurate occlusion identification, (2) Adaptive Guided Depth Diffusion for geometric consistency, and (3) SDEdit-based detail enhancement for multi-view coherence. We also introduce 360-USID, the first comprehensive dataset for unbounded scene inpainting with ground truth. Extensive experiments demonstrate that AuraFusion360 significantly outperforms existing methods, achieving superior perceptual quality while maintaining geometric accuracy across dramatic viewpoint changes.
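The three components listed above form a pipeline: identify the newly exposed ("unseen") region behind the removed object, complete its geometry, then refine appearance. The toy sketch below illustrates that flow only; every function name and body here is an illustrative stand-in (simple heuristics on NumPy arrays), not the paper's actual algorithms or API.

```python
import numpy as np

# Toy illustration of the three-stage pipeline named in the abstract:
# (1) depth-aware unseen mask, (2) guided depth completion, (3) detail
# refinement. All names and bodies are stand-in assumptions, not the
# authors' implementation.

def depth_aware_unseen_mask(object_mask, obj_depth, bg_depth, eps=1e-3):
    """Pixels covered by the removed object that occluded farther
    background become the 'unseen' region to inpaint."""
    return object_mask & (obj_depth + eps < bg_depth)

def guided_depth_fill(depth, mask):
    """Stand-in for Adaptive Guided Depth Diffusion: fill masked depth
    with the mean of known depth (the real method is far more involved)."""
    filled = depth.copy()
    filled[mask] = depth[~mask].mean()
    return filled

def refine_details(image, mask):
    """Stand-in for SDEdit-based enhancement: replace masked pixels with
    the mean color of the known region (a real method runs a diffusion
    model for multi-view-coherent detail)."""
    out = image.copy()
    out[mask] = image[~mask].mean(axis=0)
    return out

# Tiny 4x4 example scene.
obj_mask = np.zeros((4, 4), dtype=bool)
obj_mask[1:3, 1:3] = True                # footprint of the removed object
obj_depth = np.full((4, 4), 1.0)         # object was close to the camera
bg_depth = np.full((4, 4), 5.0)          # background is farther away

unseen = depth_aware_unseen_mask(obj_mask, obj_depth, bg_depth)
depth = guided_depth_fill(bg_depth, unseen)
image = np.random.default_rng(0).random((4, 4, 3))
result = refine_details(image, unseen)
```

The ordering matters: geometry (depth) is completed before appearance refinement, so the final detail pass can stay consistent with the filled geometry across views.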
