Few-Shot Hybrid Incremental Learning: Continually Learning under Data Scarcity and Task Uncertainty
Abstract
The increasing complexity of real-world deployment requires intelligent agents to adapt effectively to non-stationary data streams with stochastic increments under data scarcity. We formally define this challenge as the Few-Shot Hybrid Incremental Learning (FSHIL) paradigm, which exposes a critical stability-plasticity dilemma. Existing strategies struggle to resolve this dilemma: representation freezing in few-shot incremental learning mitigates overfitting under data scarcity but leaves the representation insufficiently plastic, while architecture expansion in hybrid incremental learning provides the plasticity needed for adaptation but overfits under few-shot conditions. To address this, we propose the Conditional Meta-Expanding Mixture-of-Experts (CME-MoE), which balances the feature-level stability-plasticity trade-off through conditional expert reuse and a meta-expansion mechanism. Furthermore, recognizing the multi-domain structure of the latent space, we introduce the Self-Expanding Prototype Classifier (SEPC), which expands the classifier on demand to model complex domain-shifted decision boundaries. The proposed method outperforms existing state-of-the-art methods in three few-shot incremental learning settings across five mainstream datasets, effectively addressing data scarcity and task uncertainty, and providing a robust solution for real-world continual learning.