Poster
Attraction Diminishing and Distributing for Few-Shot Class-Incremental Learning
Li-Jun Zhao · Zhen-Duo Chen · Yongxin Wang · Xin Luo · Xin-Shun Xu
Few-Shot Class-Incremental Learning (FSCIL) aims to continuously learn novel classes with limited samples after pre-training on a set of base classes. To avoid catastrophic forgetting and overfitting, most FSCIL methods first train the model on the base classes and then freeze the feature extractor during the incremental sessions. However, the reliance on nearest-neighbor classification makes FSCIL prone to the hubness phenomenon, which negatively impacts performance in this dynamic and open scenario. While recent methods attempt to adapt to this dynamic and open nature of FSCIL, they are often limited to biased optimizations of the feature space. In this paper, we pioneer the theoretical analysis of the inherent hubness in FSCIL. To mitigate the negative effects of hubness, we propose a novel Attraction Diminishing and Distributing (D2A) method from the essential perspectives of the distance metric and the feature space. Extensive experimental results demonstrate that our method can broadly and significantly improve the performance of existing methods.
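For readers unfamiliar with the hubness phenomenon the abstract refers to: in high-dimensional feature spaces, a few points ("hubs") appear disproportionately often among the nearest neighbors of other points, which distorts nearest-neighbor classification. Below is a minimal sketch, not the authors' code, of one common diagnostic from the hubness literature: the skewness of the k-occurrence distribution. The function name `k_occurrence_skewness` and the cosine-similarity setup are illustrative assumptions.

```python
import numpy as np
from scipy.stats import skew

def k_occurrence_skewness(features, k=10):
    """Quantify hubness as the skewness of the k-occurrence
    distribution N_k: how often each point appears among the
    k nearest neighbors of the other points. A large positive
    skew indicates that a few 'hub' points dominate neighbor
    lists, degrading nearest-neighbor classification."""
    # L2-normalize so dot products are cosine similarities.
    X = features / np.linalg.norm(features, axis=1, keepdims=True)
    sims = X @ X.T
    # Exclude self-similarity from the neighbor search.
    np.fill_diagonal(sims, -np.inf)
    # Indices of each point's k most similar neighbors.
    nn = np.argsort(-sims, axis=1)[:, :k]
    # N_k(i): how many neighbor lists point i appears in.
    counts = np.bincount(nn.ravel(), minlength=len(X))
    return skew(counts)

# Illustrative usage on random Gaussian features: hubness is a
# high-dimensionality effect, so the skew grows with dimension.
rng = np.random.default_rng(0)
for d in (16, 512):
    feats = rng.standard_normal((1000, d))
    print(d, round(float(k_occurrence_skewness(feats, k=10)), 2))
```

In the FSCIL setting sketched in the abstract, such a diagnostic would be applied to the frozen extractor's features of base and novel classes; the paper's D2A method then targets the hubness effect through the distance metric and the feature space themselves.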