Bayesian Diffusion Models for 3D Shape Reconstruction

Haiyang Xu · Yu Lei · Zeyuan Chen · Xiang Zhang · Yue Zhao · Yilin Wang · Zhuowen Tu

Arch 4A-E Poster #94
Thu 20 Jun 10:30 a.m. PDT — noon PDT


We present Bayesian Diffusion Models (BDM), a prediction algorithm that performs effective Bayesian inference by tightly coupling top-down (prior) information with a bottom-up (data-driven) procedure via joint diffusion processes. We demonstrate BDM on the 3D shape reconstruction task. Compared to standard data-driven deep learning approaches that rely on supervised data, BDM brings in rich prior information, trained in an unsupervised manner, to improve bottom-up 3D reconstruction. Unlike traditional Bayesian frameworks, which require explicitly learned prior and data-driven distributions for gradient computation and combination, BDM fuses the two seamlessly through coupled diffusion processes with learned gradient-computation networks. The distinguishing feature of BDM is its ability to carry out active and effective information exchange and fusion between the top-down and bottom-up processes, each of which is itself a diffusion process. We demonstrate state-of-the-art results on both synthetic and real-world benchmarks for 3D shape reconstruction.
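To make the coupling idea concrete, here is a minimal, purely illustrative sketch of how two diffusion branches might be fused at each reverse step. Everything here is an assumption for illustration: the denoisers are stand-in functions (the paper uses learned networks), and the convex-combination fusion rule is a simplification of the paper's learned gradient-computation coupling.

```python
import numpy as np

rng = np.random.default_rng(0)

def bottom_up_denoiser(x, t):
    # Stand-in for the image-conditioned (data-driven) reconstruction
    # network; a real BDM would use a learned point-cloud diffusion model.
    return x * (1.0 - 0.1 * t)

def top_down_denoiser(x, t):
    # Stand-in for the unconditionally trained shape-prior network.
    return x * (1.0 - 0.05 * t)

def bdm_reverse_process(n_points=1024, n_steps=10, fuse_weight=0.5):
    # Start from Gaussian noise over a 3D point cloud.
    x = rng.standard_normal((n_points, 3))
    for step in range(n_steps, 0, -1):
        t = step / n_steps
        x_data = bottom_up_denoiser(x, t)    # bottom-up proposal
        x_prior = top_down_denoiser(x, t)    # top-down (prior) proposal
        # Fuse the two branches at every step; the paper couples them via
        # learned gradient networks, approximated here by a fixed blend.
        x = fuse_weight * x_data + (1.0 - fuse_weight) * x_prior
    return x

shape = bdm_reverse_process()
```

The key point the sketch conveys is that fusion happens inside the sampling loop, at every denoising step, rather than by combining two separately learned distributions after the fact.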
