

Poster

Blowing in the Wind: CycleNet for Human Cinemagraphs From Still Images

Hugo Bertiche · Niloy J. Mitra · Kuldeep Kulkarni · Chun-Hao P. Huang · Tuanfeng Y. Wang · Meysam Madadi · Sergio Escalera · Duygu Ceylan

West Building Exhibit Halls ABC 044

Abstract:

Cinemagraphs are short looping videos created by adding subtle motion to a static image. This kind of media is popular and engaging. However, automatic generation of cinemagraphs is an underexplored area, and current solutions require tedious low-level manual authoring by artists. In this paper, we present an automatic method for generating human cinemagraphs from single RGB images. We investigate the problem in the context of dressed humans in the wind. At the core of our method is a novel cyclic neural network that produces looping cinemagraphs of a target loop duration. To circumvent the difficulty of collecting real data, we demonstrate that, by working in image normal space, it is possible to learn garment motion dynamics on synthetic data and generalize to real data. We evaluate our method on both synthetic and real data and show that it can create compelling and plausible cinemagraphs from single RGB images.
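The abstract does not detail how the cyclic network enforces seamless looping; one generic way to make a generated sequence close on itself is to condition the frame decoder on a cyclic phase embedding (sin/cos of a phase angle), so the conditioning at t = 0 and t = T is identical. The sketch below illustrates only that idea; the module names, layer choices, and the encoder/decoder stand-ins are hypothetical and not the authors' CycleNet architecture.

```python
import math
import torch
import torch.nn as nn


class CyclicFrameGenerator(nn.Module):
    """Illustrative sketch: a frame decoder conditioned on a cyclic phase
    embedding, so the output at t = 0 and t = loop_duration is identical,
    which yields a seamless loop by construction."""

    def __init__(self, feat_dim: int = 256, out_channels: int = 3):
        super().__init__()
        # Hypothetical stand-ins for an image encoder and frame decoder.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, feat_dim, kernel_size=3, padding=1), nn.ReLU()
        )
        self.decoder = nn.Conv2d(feat_dim + 2, out_channels, kernel_size=3, padding=1)

    def forward(self, image: torch.Tensor, t: float, loop_duration: float) -> torch.Tensor:
        # Phase in [0, 2*pi); wraps around so t = 0 and t = loop_duration
        # produce the same conditioning signal.
        phase = 2.0 * math.pi * (t % loop_duration) / loop_duration
        feats = self.encoder(image)
        b, _, h, w = feats.shape
        # Broadcast (sin, cos) of the phase as two constant feature maps.
        phase_map = torch.stack(
            [
                torch.full((b, h, w), math.sin(phase)),
                torch.full((b, h, w), math.cos(phase)),
            ],
            dim=1,
        ).to(feats)
        return self.decoder(torch.cat([feats, phase_map], dim=1))


# Usage example: render two frames one full loop apart; they match exactly.
gen = CyclicFrameGenerator()
img = torch.rand(1, 3, 64, 64)
frame_start = gen(img, t=0.0, loop_duration=2.0)
frame_loop = gen(img, t=2.0, loop_duration=2.0)
assert torch.allclose(frame_start, frame_loop)
```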
