

Poster

Dual-consistency Model Inversion for Non-exemplar Class Incremental Learning

Zihuan Qiu · Yi Xu · Fanman Meng · Hongliang Li · Linfeng Xu · Qingbo Wu


Abstract:

Non-exemplar class incremental learning (NECIL) aims to continuously assimilate new knowledge without forgetting previously acquired knowledge when historical data are unavailable. One line of generative NECIL methods inverts images of old classes for joint training. However, these synthetic images suffer from significant domain shifts relative to real data, hampering the recognition of old classes. In this paper, we present a novel method termed Dual-Consistency Model Inversion (DCMI), which generates better synthetic samples of old classes through two pivotal consistency alignments: (1) the semantic consistency between the synthetic images and their corresponding prototypes, and (2) the domain consistency between synthetic and real images of new classes. Additionally, we introduce Prototypical Routing (PR) to provide task-prior information and produce unbiased, accurate predictions. Comprehensive experiments across diverse datasets consistently demonstrate the superiority of our method over previous state-of-the-art approaches. The code will be released.
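For intuition only, below is a minimal PyTorch-style sketch of how the two consistency terms and the prototype-based routing described in the abstract could be realized. The function names, the cosine and moment-matching loss forms, and the nearest-prototype routing rule are all assumptions for illustration; the paper's actual formulation may differ.

    import torch
    import torch.nn.functional as F

    def semantic_consistency_loss(features, prototypes, labels):
        # Pull features of inverted images toward the stored prototypes of
        # their target old classes (cosine distance is an assumption here).
        target = prototypes[labels]                                  # (B, D)
        return 1.0 - F.cosine_similarity(features, target, dim=1).mean()

    def domain_consistency_loss(syn_features, real_features):
        # Match first- and second-order feature statistics of synthetic
        # old-class images against real new-class images (a moment-matching
        # proxy; the paper's concrete alignment term may differ).
        loss_mean = F.mse_loss(syn_features.mean(0), real_features.mean(0))
        loss_var = F.mse_loss(syn_features.var(0), real_features.var(0))
        return loss_mean + loss_var

    def inversion_step(syn_images, labels, encoder, prototypes,
                       real_new_images, lam_sem=1.0, lam_dom=1.0):
        # One model-inversion step: syn_images is a leaf tensor with
        # requires_grad=True that is optimized directly; encoder is the
        # frozen feature extractor from previous tasks.
        syn_feat = encoder(syn_images)
        with torch.no_grad():
            real_feat = encoder(real_new_images)
        loss = (lam_sem * semantic_consistency_loss(syn_feat, prototypes, labels)
                + lam_dom * domain_consistency_loss(syn_feat, real_feat))
        loss.backward()
        return loss.detach()

    def prototypical_routing(features, prototypes, task_of_class):
        # Route each sample to the task owning its nearest class prototype,
        # supplying a task prior (nearest-prototype routing is an assumed
        # stand-in for the paper's Prototypical Routing module).
        nearest = torch.cdist(features, prototypes).argmin(dim=1)    # (B,)
        return task_of_class[nearest]

In this reading, the semantic term keeps each inverted image recognizable as its intended old class, while the domain term discourages the statistical drift between synthetic and real data that the abstract identifies as the main failure mode.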
