

Any-Shift Prompting for Generalization over Distributions

Zehao Xiao · Jiayi Shen · Mohammad Mahdi Derakhshani · Shengcai Liao · Cees G. M. Snoek

Arch 4A-E Poster #406
Thu 20 Jun 10:30 a.m. PDT — noon PDT


Image-language models with prompt learning have shown remarkable advances in numerous downstream vision tasks. Nevertheless, conventional prompt learning methods tend to overfit the training distribution and lose generalization ability on test distributions. To improve generalization across various distribution shifts, we propose any-shift prompting: a general probabilistic inference framework that considers the relationship between training and test distributions during prompt learning. We explicitly connect training and test distributions in the latent space by constructing training and test prompts in a hierarchical architecture. Within this framework, the test prompt exploits the distribution relationships to guide the generalization of the CLIP image-language model from training to any test distribution. To effectively encode the distribution information and its relationships, we further introduce a transformer inference network with a pseudo-shift training mechanism. The network generates the tailored test prompt with both training and test information in a single feedforward pass, avoiding extra training costs at test time. Extensive experiments on twenty-three datasets demonstrate the effectiveness of any-shift prompting for generalization over various distribution shifts.
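The core mechanism in the abstract is a transformer inference network that produces a test prompt from both the learned training prompt and the incoming test features in one feedforward pass. The sketch below is an illustrative toy, not the authors' implementation: all names, shapes, and the single-head attention layer are assumptions, standing in for the full transformer described in the paper. It shows the data flow only: test-distribution features act as queries that attend over training prompt tokens, so the output prompt mixes training and test information without any test-time optimization.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def generate_test_prompt(train_prompt, test_feats, Wq, Wk, Wv):
    """Single feedforward pass (toy single-head attention, an assumption):
    test features attend over training prompt tokens to produce a
    distribution-tailored test prompt."""
    q = test_feats @ Wq           # queries from test-distribution features
    k = train_prompt @ Wk         # keys from training prompt tokens
    v = train_prompt @ Wv         # values from training prompt tokens
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return attn @ v               # prompt conditioned on both distributions

rng = np.random.default_rng(0)
d = 8                                     # toy embedding size (assumption)
train_prompt = rng.normal(size=(4, d))    # 4 learned prompt tokens
test_feats = rng.normal(size=(2, d))      # 2 features from a shifted test batch
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
test_prompt = generate_test_prompt(train_prompt, test_feats, Wq, Wk, Wv)
print(test_prompt.shape)  # one prompt vector per test feature: (2, 8)
```

Because the prompt is produced by a forward pass rather than gradient steps, adapting to a new test distribution costs only one network evaluation, which is the efficiency point the abstract emphasizes.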
