

ControlRoom3D: Room Generation using Semantic Proxy Rooms

Jonas Schult · Sam Tsai · Lukas Höllein · Bichen Wu · Jialiang Wang · Chih-Yao Ma · Kunpeng Li · Xiaofang Wang · Felix Wimbauer · Zijian He · Peizhao Zhang · Bastian Leibe · Peter Vajda · Ji Hou

Arch 4A-E Poster #136
Wed 19 Jun 5 p.m. PDT — 6:30 p.m. PDT


Manually creating 3D environments for AR/VR applications is a complex process requiring expert knowledge in 3D modeling software. Pioneering works facilitate this process by generating room meshes conditioned on textual style descriptions. Yet, many of these automatically generated 3D meshes do not adhere to typical room layouts, compromising their plausibility, e.g., by placing several beds in one bedroom. To address these challenges, we present ControlRoom3D, a novel method to generate high-quality room meshes. Central to our approach is a user-defined 3D semantic proxy room that outlines a rough room layout based on semantic bounding boxes and a textual description of the overall room style. Our key insight is that when rendered to 2D, this 3D representation provides valuable geometric and semantic information to control powerful 2D models to generate 3D-consistent textures and geometry that align well with the proxy room. Backed by an extensive study including quantitative metrics and qualitative user evaluations, our method generates diverse and globally plausible 3D room meshes, thus empowering users to design 3D rooms effortlessly without specialized knowledge.
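To make the core idea concrete: a "semantic proxy room" can be thought of as a set of labeled 3D bounding boxes that, once rendered from a camera pose, yield per-pixel semantic and depth maps suitable as conditioning inputs for a 2D generative model. The sketch below is purely illustrative and is not the authors' implementation; the function names, the axis-aligned-box representation, and the brute-force ray casting are all assumptions made for exposition.

```python
import numpy as np

def ray_aabb(origin, direction, box_min, box_max):
    # Slab method: return the entry distance t along the ray, or None on a miss.
    inv = 1.0 / direction
    t0 = (box_min - origin) * inv
    t1 = (box_max - origin) * inv
    tmin = np.minimum(t0, t1).max()
    tmax = np.maximum(t0, t1).min()
    if tmax >= max(tmin, 0.0):
        return max(tmin, 0.0)
    return None

def render_proxy_room(boxes, cam_origin, img_size=32, focal=32.0):
    """Render semantic-ID and depth maps of labeled axis-aligned boxes from a
    pinhole camera looking down +z. `boxes` is a list of
    (label_id, box_min, box_max) tuples; label 0 is reserved for background."""
    sem = np.zeros((img_size, img_size), dtype=np.int32)
    depth = np.full((img_size, img_size), np.inf)
    c = img_size / 2.0
    for v in range(img_size):
        for u in range(img_size):
            # Ray through the pixel center, normalized to unit length.
            d = np.array([(u + 0.5 - c) / focal, (v + 0.5 - c) / focal, 1.0])
            d /= np.linalg.norm(d)
            for label, bmin, bmax in boxes:
                t = ray_aabb(cam_origin, d, bmin, bmax)
                if t is not None and t < depth[v, u]:
                    # Keep the nearest hit: it defines both depth and semantics.
                    depth[v, u] = t
                    sem[v, u] = label
    return sem, depth

# Hypothetical proxy room with one "bed" box (label 1) in front of the camera.
boxes = [(1, np.array([-0.5, -0.5, 2.0]), np.array([0.5, 0.5, 3.0]))]
sem_map, depth_map = render_proxy_room(boxes, np.array([0.0, 0.0, 0.0]))
```

The resulting `sem_map`/`depth_map` pair is exactly the kind of per-view geometric and semantic guidance the abstract refers to: a depth-and-label-conditioned 2D model (e.g., a ControlNet-style network) can then synthesize an appearance image that respects the box layout for that view.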
