UniPixie: Unified and Probabilistic 3D Physics Learning via Flow Matching
Abstract
Recent progress in 3D reconstruction, such as Neural Radiance Fields (NeRFs) and 3D Gaussian Splatting, has made it easy to recover geometry and appearance from images. However, these static representations remain blind to the physics that govern how objects deform and respond to forces. Building interactive 3D worlds therefore requires predicting not only shape but also the underlying material properties. Prior approaches rely either on slow test-time optimization or, more recently, on a fast feed-forward predictor such as Pixie. Yet these models produce only a single point estimate of the physical parameters and are tied to a single simulation backend, restricting both expressiveness and portability. We introduce UniPixie, a generative physics-from-pixels framework that overcomes both limitations. Using flow matching, UniPixie predicts a controllable, continuous soft-to-stiff distribution of plausible material properties from a single visual input, capturing the inherent ambiguity of inferring physics from appearance. In addition, UniPixie is the first unified architecture to generate simulation-ready parameters for multiple physics solvers, including the Material Point Method (MPM), Linear Blend Skinning (LBS), and Spring-Mass systems. Trained on our new PIXIEMULTIVERSE dataset of annotated material ranges, UniPixie produces diverse, physically consistent dynamics and achieves state-of-the-art accuracy, outperforming deterministic baselines by more than 2x while inheriting the fast, generalizable inference of prior feed-forward work.