

Poster

SEC-Prompt: SEmantic Complementary Prompting for Few-Shot Class-Incremental Learning

Ye Liu · Meng Yang


Abstract:

Few-shot class-incremental learning (FSCIL) presents a significant challenge in machine learning: models must integrate new classes from limited examples while preserving performance on previously learned classes. Recent prompt-based CIL approaches leverage ample data to train prompts, effectively mitigating catastrophic forgetting. However, these methods do not account for the semantic features embedded in prompts, which exacerbates the plasticity-stability dilemma in few-shot incremental learning. In this paper, we propose a novel and simple framework named SEmantic Complementary Prompt (SEC-Prompt), which learns two sets of semantically complementary prompts based on an adaptive query: discriminative prompts (D-Prompt) and non-discriminative prompts (ND-Prompt). D-Prompt enhances the separation of class-specific feature distributions by strengthening key discriminative features, while ND-Prompt balances non-discriminative information to promote generalization to novel classes. To efficiently learn high-quality knowledge from limited samples, we leverage ND-Prompt for data augmentation to increase sample diversity and introduce a Prompt Clustering Loss that prevents noise contamination in D-Prompt, ensuring robust discriminative feature learning and improved generalization. Our experimental results demonstrate state-of-the-art performance across four benchmark datasets, including CIFAR100, ImageNet-R, and CUB.
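The abstract does not give implementation details, so the following is only a minimal PyTorch sketch of how an adaptive query could select two complementary prompt pools and how a prompt clustering loss could be formed. The pool sizes, the cosine top-k selection rule, and the centroid-pulling loss are all illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SemanticComplementaryPrompts(nn.Module):
    """Two complementary prompt pools selected by an adaptive query.
    Pool sizes, dimensions, and the top-k cosine selection rule are
    illustrative assumptions, not the authors' published design."""

    def __init__(self, embed_dim=768, pool_size=20, prompt_len=5):
        super().__init__()
        # Learnable discriminative (D) and non-discriminative (ND) pools,
        # each paired with keys that the query is matched against.
        self.d_pool = nn.Parameter(torch.randn(pool_size, prompt_len, embed_dim))
        self.nd_pool = nn.Parameter(torch.randn(pool_size, prompt_len, embed_dim))
        self.d_keys = nn.Parameter(torch.randn(pool_size, embed_dim))
        self.nd_keys = nn.Parameter(torch.randn(pool_size, embed_dim))

    def forward(self, query, top_k=3):
        # query: (B, embed_dim) image feature acting as the adaptive query.
        d_sim = F.cosine_similarity(query.unsqueeze(1), self.d_keys.unsqueeze(0), dim=-1)
        nd_sim = F.cosine_similarity(query.unsqueeze(1), self.nd_keys.unsqueeze(0), dim=-1)
        d_idx = d_sim.topk(top_k, dim=1).indices        # (B, top_k)
        nd_idx = nd_sim.topk(top_k, dim=1).indices
        d_prompt = self.d_pool[d_idx].flatten(1, 2)     # (B, top_k*prompt_len, D)
        nd_prompt = self.nd_pool[nd_idx].flatten(1, 2)
        return d_prompt, nd_prompt, d_idx

def prompt_clustering_loss(d_pool, d_idx, labels):
    """One plausible form of a prompt clustering loss: pull each sample's
    selected D-Prompts toward its class centroid, so that noisy few-shot
    samples do not scatter the discriminative prompts."""
    # (B, prompt_len*D): per-sample summary of the selected D-Prompts.
    selected = d_pool[d_idx].mean(dim=1).flatten(1)
    classes = labels.unique()
    loss = selected.new_zeros(())
    for c in classes:
        members = selected[labels == c]
        centroid = members.mean(dim=0, keepdim=True).detach()
        loss = loss + (members - centroid).pow(2).sum(dim=1).mean()
    return loss / classes.numel()

# Toy usage with hypothetical sizes.
prompts = SemanticComplementaryPrompts(embed_dim=32, pool_size=8, prompt_len=2)
d_p, nd_p, d_idx = prompts(torch.randn(4, 32), top_k=2)
loss = prompt_clustering_loss(prompts.d_pool, d_idx, torch.tensor([0, 0, 1, 1]))
```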
