Decouple Your Discovery and Memory in Continual Generalized Category Discovery
Abstract
Continual Generalized Category Discovery (C-GCD) aims to incrementally discover new categories from unlabeled data while retaining knowledge of old categories, fostering model adaptability in real-world scenarios. Notably, the unlabeled data contains samples from both old and new classes, so the model must recognize previously learned classes even as it discovers novel ones. In response, recent efforts devise specialized frameworks and various anti-forgetting strategies, striving for the typical stability-plasticity trade-off. Unlike previous studies, in this work we first revisit these methods and find that most of them over-protect old classes, hampering the accurate discovery of novel ones. To address this challenge, we introduce Decouple Your Discovery and Memory (DYDM), a dual-branch architecture that decouples the discovery of new classes from the memorization of old ones. The discovery branch focuses on accurately recognizing new classes, while the memory branch consolidates all identified categories in a recursive manner and serves as the inference branch. Importantly, thanks to the strong knowledge retention of the memory branch, the discovery branch can better recognize novel classes in the unlabeled data, achieving a win-win outcome between plasticity and stability. Extensive experiments on various datasets and settings demonstrate the superiority of our approach, which leads existing methods by up to 9.87%, 7.30%, 3.18%, and 8.25%. Furthermore, our framework can be integrated with existing approaches, consistently enhancing their performance.
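To make the decoupling concrete, the following is a minimal sketch of the dual-branch idea, not the paper's actual method: the class name, the prototype-based representation, the farthest-point k-means used by the discovery branch, and the exponential-moving-average consolidation in the memory branch are all hypothetical stand-ins. It only illustrates the separation of roles: the discovery branch clusters unlabeled data into candidate new classes, while the memory branch recursively folds all known classes into one store and is the branch used at inference time.

```python
import numpy as np


class DYDMSketch:
    """Illustrative dual-branch decoupling (details are assumptions,
    not the paper's implementation)."""

    def __init__(self, momentum=0.9):
        # Memory branch state: one prototype per known class.
        self.memory_protos = {}
        self.momentum = momentum

    def discover(self, feats, n_new):
        """Discovery branch: cluster unlabeled features into n_new
        candidate classes (plain k-means with deterministic
        farthest-point seeding, purely illustrative)."""
        idx = [0]
        for _ in range(n_new - 1):
            d = np.min(
                [((feats - feats[i]) ** 2).sum(1) for i in idx], axis=0
            )
            idx.append(int(d.argmax()))
        centers = feats[idx].copy()
        for _ in range(10):  # Lloyd iterations
            d = ((feats[:, None] - centers[None]) ** 2).sum(-1)
            assign = d.argmin(1)
            for k in range(n_new):
                pts = feats[assign == k]
                if len(pts):
                    centers[k] = pts.mean(0)
        return centers, assign

    def consolidate(self, protos):
        """Memory branch: recursively fold prototypes into the running
        class set -- EMA update for classes seen before, copy for new."""
        for c, p in protos.items():
            if c in self.memory_protos:
                m = self.momentum
                self.memory_protos[c] = m * self.memory_protos[c] + (1 - m) * p
            else:
                self.memory_protos[c] = p.copy()

    def infer(self, feat):
        """Inference runs on the memory branch over all known classes."""
        keys = list(self.memory_protos)
        dists = [np.linalg.norm(feat - self.memory_protos[k]) for k in keys]
        return keys[int(np.argmin(dists))]


# Toy session: discover two well-separated synthetic "classes",
# then consolidate them into the memory branch for inference.
model = DYDMSketch()
rng = np.random.default_rng(1)
old = rng.normal([0.0, 0.0], 0.1, size=(50, 2))
new = rng.normal([10.0, 10.0], 0.1, size=(50, 2))
centers, _ = model.discover(np.vstack([old, new]), n_new=2)
model.consolidate({i: c for i, c in enumerate(centers)})
```

In this sketch the discovery branch never has to preserve old prototypes, which is the decoupling the abstract describes: protecting old classes is the memory branch's job, so discovery is free to fit the new data.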