

SplattingAvatar: Realistic Real-Time Human Avatars with Mesh-Embedded Gaussian Splatting

Zhijing Shao · Zhaolong Wang · Zhuang Li · Duotun Wang · Xiangru Lin · Yu Zhang · Mingming Fan · Zeyu Wang

Arch 4A-E Poster #140
Wed 19 Jun 10:30 a.m. PDT — noon PDT


We present SplattingAvatar, a hybrid 3D representation of photorealistic human avatars with Gaussian Splatting embedded on a triangle mesh, which renders at over 300 FPS on a modern GPU and 30 FPS on a mobile device. We disentangle the motion and appearance of a virtual human: explicit mesh geometry drives the motion, while implicit Gaussian Splatting handles rendering. Each Gaussian is defined by barycentric coordinates and a displacement on a triangle mesh treated as a Phong surface. We extend lifted optimization to simultaneously optimize the parameters of the Gaussians while they walk on the triangle mesh. In this hybrid representation, the mesh captures low-frequency motion and surface deformation, while the Gaussians take over the high-frequency geometry and detailed appearance. Unlike existing deformation methods that rely on an MLP-based linear blend skinning (LBS) field for motion, we control the rotation and translation of the Gaussians directly through the mesh, which makes the representation compatible with various animation techniques, e.g., skeletal animation, blend shapes, and mesh editing. Trainable from monocular videos for both full-body and head avatars, SplattingAvatar achieves state-of-the-art rendering quality across multiple datasets. We plan to release our source code to support further research in digital human modeling.
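The mesh embedding described above can be illustrated with a minimal sketch: a Gaussian's position is recovered from its barycentric coordinates on a triangle plus a scalar displacement along the Phong-interpolated (per-vertex) normal. The triangle data, function name, and values below are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical triangle: vertex positions V and unit vertex normals N.
V = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
N = np.array([[0.0, 0.0, 1.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 1.0]])

def embedded_position(bary, d):
    """Position of a Gaussian embedded on the triangle:
    barycentric interpolation of the vertices, plus a scalar
    displacement d along the Phong-interpolated normal."""
    p = bary @ V                    # point on the flat triangle
    n = bary @ N
    n = n / np.linalg.norm(n)       # interpolated (Phong) normal
    return p + d * n

# A Gaussian at the triangle's centroid, displaced 0.05 off the surface.
bary = np.array([1/3, 1/3, 1/3])
print(embedded_position(bary, 0.05))  # -> [0.3333... 0.3333... 0.05]
```

Because the position depends only on the triangle's vertices, normals, and the local parameters (bary, d), deforming or animating the mesh moves the Gaussians with it without any learned skinning field.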
