

Poster

Exploring Incompatible Knowledge Transfer in Few-Shot Image Generation

Yunqing Zhao · Chao Du · Milad Abdollahzadeh · Tianyu Pang · Min Lin · Shuicheng Yan · Ngai-Man Cheung

West Building Exhibit Halls ABC 311

Abstract:

Few-shot image generation (FSIG) learns to generate diverse and high-fidelity images from a target domain using only a few (e.g., 10) reference samples. Existing FSIG methods select, preserve, and transfer prior knowledge from a source generator (pretrained on a related domain) to learn the target generator. In this work, we investigate an underexplored issue in FSIG, dubbed incompatible knowledge transfer, which can significantly degrade the realism of synthetic samples. Empirical observations show that the issue stems from the least significant filters of the source generator. To address this, we propose knowledge truncation, a complementary operation to knowledge preservation that mitigates the issue and is implemented by a lightweight pruning-based method. Extensive experiments show that knowledge truncation is simple and effective, consistently achieving state-of-the-art performance, including challenging setups where the source and target domains are more distant. Project Page: https://yunqing-me.github.io/RICK.
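To make the idea of knowledge truncation more concrete, below is a minimal, hypothetical sketch (not the authors' implementation): it zeroes out the k least significant convolutional filters of a source-generator layer, using the filters' L1 norm as a stand-in importance score. The function name, the choice of L1 norm as the significance criterion, and the toy layer are all illustrative assumptions; the paper's actual pruning criterion and training procedure may differ.

```python
import torch
import torch.nn as nn

def truncate_least_significant_filters(conv: nn.Conv2d, k: int) -> None:
    """Illustrative knowledge truncation: zero out the k lowest-importance
    output filters of a conv layer, with importance approximated by L1 norm
    (an assumption, used here as a common lightweight pruning criterion)."""
    with torch.no_grad():
        # Importance score per output filter: L1 norm of its weights.
        scores = conv.weight.abs().sum(dim=(1, 2, 3))
        # Indices of the k lowest-scoring (least significant) filters.
        prune_idx = torch.topk(scores, k, largest=False).indices
        # "Truncate" the knowledge these filters carry by zeroing them out.
        conv.weight[prune_idx] = 0.0
        if conv.bias is not None:
            conv.bias[prune_idx] = 0.0

# Toy usage on a stand-in layer from a pretrained source generator.
layer = nn.Conv2d(in_channels=64, out_channels=128, kernel_size=3, padding=1)
truncate_least_significant_filters(layer, k=16)
x = torch.randn(1, 64, 32, 32)
print(layer(x).shape)  # torch.Size([1, 128, 32, 32])
```

In this reading, truncation acts on the filters that knowledge-preservation criteria rank lowest, so the two operations partition the source generator's filters rather than compete over them.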
