Poster
DeClotH: Decomposable 3D Cloth and Human Body Reconstruction from a Single Image
Hyeongjin Nam · Donghwan Kim · Jeongtaek Oh · Kyoung Mu Lee
Most existing methods for 3D clothed human reconstruction from a single image treat the clothed human as a single object, without separating the cloth from the body. In this work, we present DeClotH, a novel framework that separately reconstructs the 3D cloth and the 3D human body from a single image. This task has remained unexplored because of the extreme occlusion between cloth and the human body, which makes it difficult to infer accurate geometries and textures for both. Furthermore, while recent 3D human reconstruction methods achieve impressive results by leveraging text-to-image diffusion models, naively adopting such a strategy for this problem often provides incorrect guidance, especially when reconstructing 3D cloth. To address these challenges, we propose two core designs. First, to alleviate the occlusion issue, we leverage 3D template models of both the cloth and the human body as regularizers, which provide strong priors and prevent erroneous reconstruction caused by occlusion. Second, we employ a cloth diffusion model specifically devised to provide contextual information about cloth appearance for reconstructing 3D cloth. Qualitative and quantitative experiments demonstrate that our proposed approaches are highly effective for reconstructing both 3D cloth and the human body. The code will be released.
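To make the two core designs concrete, below is a minimal PyTorch sketch of how they could plug into an optimization loop: a template-based regularizer that keeps the reconstructed surfaces close to fitted 3D templates, and a score-distillation-style loss that draws guidance from a diffusion model. The paper's actual cloth diffusion model, conditioning, and loss weights are not public, so every name here (`template_regularization`, `sds_loss`, `diffusion.unet`, `diffusion.add_noise`) is an illustrative assumption, not the authors' API.

```python
import torch
import torch.nn.functional as F

def template_regularization(pred_verts, template_verts, weight=1e-2):
    """Hypothetical regularizer: penalize deviation of the reconstructed
    surface from a fitted 3D template (e.g., a body model for the human,
    a garment template for the cloth). Assumes both meshes share the same
    vertex topology so a per-vertex loss is well defined."""
    return weight * F.mse_loss(pred_verts, template_verts)

def sds_loss(diffusion, latents, cond_embed, guidance_scale=7.5):
    """Score-distillation-style guidance from a (cloth-specific) diffusion
    model. `diffusion` is a stand-in for any noise-prediction model with an
    `add_noise` scheduler and a conditional `unet`; these are assumptions."""
    # Sample a random diffusion timestep and perturb the rendered latents.
    t = torch.randint(20, 980, (latents.shape[0],), device=latents.device)
    noise = torch.randn_like(latents)
    noisy = diffusion.add_noise(latents, noise, t)
    # Classifier-free guidance: combine conditional and unconditional scores.
    with torch.no_grad():
        eps_cond = diffusion.unet(noisy, t, cond_embed)
        eps_uncond = diffusion.unet(noisy, t, torch.zeros_like(cond_embed))
    eps = eps_uncond + guidance_scale * (eps_cond - eps_uncond)
    # The detached residual (eps - noise) acts as the update direction;
    # the surrogate loss below has exactly this gradient w.r.t. `latents`.
    grad = (eps - noise).detach()
    return (grad * latents).sum()
```

Under these assumptions, each optimization step would render the current cloth and body reconstructions, sum `sds_loss` on the renders with `template_regularization` on the vertices, and backpropagate to the scene parameters; the template term counteracts the occlusion-induced ambiguity that the diffusion guidance alone cannot resolve.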