Opti-NeuS: Neural Reconstruction for Dual-Layered Transparent and Opaque Objects
Abstract
3D reconstruction of transparent objects from multiple views has been a long-standing challenge. In contrast to opaque objects, transparent objects exhibit complex refraction that causes severe image distortion, making reconstruction highly ill-posed. Existing methods commonly depend on special capture devices or controlled environments, which supply additional priors that simplify the modeling of refraction. More importantly, being confined to a single material type, these methods cannot reconstruct objects that mix transparent and opaque parts. To address these challenges, we propose Opti-NeuS, a novel method for reconstructing transparent and opaque objects without controlled environments or additional input. Opti-NeuS incorporates a novel IoRNetwork that predicts a spatially varying index of refraction (IoR) for tracing refracted ray paths, enabling it to model refraction-induced visual distortion. To handle dual-layered transparent and opaque objects, we devise a two-stage hierarchical reconstruction strategy that decouples the outer and inner geometry, combined with alpha-blending for transparency-aware surface separation. Experiments show that Opti-NeuS is practical and effective, outperforming prior methods.
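The refracted ray paths described above bend at each material interface according to Snell's law, driven by the ratio of the indices of refraction on either side. The following minimal sketch computes a refracted direction in vector form; the `refract` helper and its interface are illustrative assumptions, not the paper's implementation:

```python
import math

def refract(d, n, eta):
    """Refract a unit direction d at a surface with unit normal n.
    eta = n_incident / n_transmitted (ratio of the two IoRs).
    Returns the refracted unit direction, or None on total internal reflection.
    Illustrative helper; not taken from the Opti-NeuS paper."""
    cos_i = -sum(di * ni for di, ni in zip(d, n))       # cosine of incidence angle
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)          # Snell's law, squared sine of transmitted angle
    if sin2_t > 1.0:
        return None                                      # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    k = eta * cos_i - cos_t
    return tuple(eta * di + k * ni for di, ni in zip(d, n))
```

A spatially varying IoR, as predicted by the IoRNetwork, would supply a different `eta` at each interface the ray crosses, so the traced path can reproduce the distortion seen in the captured images.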