Poster
4DTAM: Non-Rigid Tracking and Mapping via Dynamic Surface Gaussians
Hidenobu Matsuki · Gwangbin Bae · Andrew J. Davison
We propose the first tracking and mapping approach for a single RGB-D camera capable of non-rigid surface reconstruction via differentiable rendering. We perform 4D scene capture from an online stream by jointly optimizing geometry, appearance, dynamics, and camera ego-motion. Although natural environments contain complex non-rigid motions, non-rigid SLAM has remained difficult: even with 2.5D sensor measurements, the problem is ill-posed due to the high dimensionality of the optimization. Our novel SLAM method based on Gaussian surface primitives allows accurate 3D reconstruction and real-time rendering without any template, using a warp field represented by a multi-layer perceptron (MLP) together with regularization terms to enable spatio-temporal reconstruction. A persistent challenge in non-rigid SLAM research is the lack of publicly available datasets with reliable ground truth and standardized evaluation protocols. To address this, we introduce a novel synthetic dataset of everyday objects featuring diverse motions, leveraging the availability of large-scale object assets and advancements in animation modeling.
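To illustrate the warp-field idea mentioned in the abstract, the following is a minimal NumPy sketch of an MLP that maps a canonical 3D point and a time value to a 3D displacement, which could then be applied to the center of a Gaussian surface primitive. The layer sizes, activation, and input encoding here are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Random weights and biases for a small fully connected network."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def warp_field(params, xyz, t):
    """Displacement for canonical point xyz at time t (hypothetical warp-field MLP)."""
    h = np.concatenate([xyz, [t]])   # 4-D input: position + time
    for i, (W, b) in enumerate(params):
        h = h @ W + b
        if i < len(params) - 1:      # ReLU on hidden layers only
            h = np.maximum(h, 0.0)
    return h                         # 3-D displacement

# (x, y, z, t) -> (dx, dy, dz); hidden widths are arbitrary choices
params = init_mlp([4, 32, 32, 3])
p = np.array([0.1, -0.2, 0.5])       # canonical Gaussian center
warped = p + warp_field(params, p, t=0.3)
```

In an actual differentiable-rendering pipeline the MLP weights would be optimized jointly with the Gaussian parameters and camera pose against the rendered RGB-D stream; this sketch only shows the forward deformation.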