Poster

GBC: Generalizable Gaussian-Based Clothed Human Digitalization

Hanzhang Tu · Zhanfeng Liao · Boyao Zhou · Shunyuan Zheng · Xilong Zhou · Liuxin ZHANG · QianYing Wang · Yebin Liu


Abstract:

We present an efficient approach for generalizable clothed human digitalization. Unlike previous methods that require subject-wise optimization or forgo watertight geometry, the proposed method is dedicated to reconstructing a complete human shape and Gaussian Splatting representation from sparse-view RGB input. We extract a fine-grained mesh by combining implicit occupancy field regression with explicit disparity estimation between views. The reconstructed high-quality geometry allows us to easily anchor Gaussian primitives according to surface normals and texture, enabling 6-DoF photorealistic novel view synthesis. Further, we introduce a simple yet effective algorithm that splits Gaussian primitives in high-frequency areas to enhance visual quality. Without the assistance of templates such as SMPL, our method can handle loose clothing like dresses and costumes. To achieve generalization across datasets, we train our reconstruction pipeline on a large amount of human scan data. Our method outperforms recent methods in novel view synthesis while remaining highly efficient, enabling potential deployment in real-time applications.
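The abstract does not specify how the high-frequency splitting works. As a rough illustration only, the sketch below shows one plausible heuristic under stated assumptions: each primitive carries a precomputed "frequency score" (e.g. a local image-gradient magnitude, not defined in the paper), and primitives above a threshold are replaced by two children offset along their dominant scale axis with that axis halved. All function and parameter names here are hypothetical.

```python
import numpy as np

def split_high_freq_gaussians(means, scales, freq_scores, thresh=0.5):
    """Illustrative splitting heuristic (not the paper's algorithm).

    means:       (N, 3) primitive centers
    scales:      (N, 3) per-axis standard deviations
    freq_scores: (N,) assumed per-primitive high-frequency score
    Primitives with freq_scores > thresh are replaced by two children,
    offset along the largest scale axis, with that axis's scale halved.
    """
    split = freq_scores > thresh
    keep_means, keep_scales = means[~split], scales[~split]

    parent_m, parent_s = means[split], scales[split]
    idx = np.arange(len(parent_s))
    axis = np.argmax(parent_s, axis=1)            # dominant axis per parent

    offset = np.zeros_like(parent_m)
    offset[idx, axis] = 0.5 * parent_s[idx, axis]  # children sit +/- half-scale apart

    child_s = parent_s.copy()
    child_s[idx, axis] *= 0.5                      # shrink along the split axis

    new_means = np.concatenate([keep_means, parent_m + offset, parent_m - offset])
    new_scales = np.concatenate([keep_scales, child_s, child_s])
    return new_means, new_scales
```

A real implementation would also divide opacity between the children and resample rotations, which this sketch omits for brevity.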