Beyond the Static World: Continual Category Discovery under Visual Drift
Abstract
Generalized Category Discovery (GCD) aims to identify both known and novel classes in unlabeled data with the aid of labeled examples. While promising, most existing GCD methods rely on simultaneous access to labeled and unlabeled datasets—an assumption that is often impractical in real-world deployments. Continual Category Discovery (CCD) relaxes this requirement by adapting a pre-trained model to streaming unlabeled data, yet it typically assumes domain-consistent data distributions, which strongly limits its applicability. In this work, we study Open Continual Category Discovery (OCCD), where the model must robustly discover previously unseen concepts from real-world data streams that may originate from heterogeneous and shifting domains. To address this, we propose an adaptive framework built on three key ideas. First, we propose a weight-aware separation module, which leverages partial unbalanced optimal transport for instance probability modeling and employs binary response spectrum quantization to generate cues for distinguishing known from unknown categories, enabling automatic sample separation. Second, for known categories, we introduce a cross-domain semantic alignment module that incorporates adversarial learning to perform adaptive prototype matching, thereby enhancing robustness against domain shifts. Finally, for unknown categories, we design a category topology consistency constraint that preserves semantic relationships between known and novel classes under distribution shift. Experiments show that our approach excels at discovering new categories while maintaining strong performance on known ones across evolving domains.
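To make the first idea concrete, the sketch below illustrates how unbalanced optimal transport can separate known from unknown samples: samples are transported toward known-class prototypes under a soft marginal constraint, so samples far from every prototype receive little transport mass and can be flagged as candidate novel categories. This is only a minimal illustration under stated assumptions, not the paper's method: the function name `unbalanced_sinkhorn`, the toy data, the hyperparameters `eps`/`tau`, and the simple mass threshold (a stand-in for the paper's binary response spectrum quantization) are all our own choices.

```python
import numpy as np

def unbalanced_sinkhorn(cost, a, b, eps=1.0, tau=1.0, n_iter=200):
    """Entropic unbalanced OT via Sinkhorn scaling with KL-relaxed marginals.

    cost : (n, m) pairwise cost matrix (samples x prototypes)
    a, b : source / target mass vectors
    eps  : entropic regularization; tau : marginal relaxation strength
    """
    K = np.exp(-cost / eps)                 # Gibbs kernel
    u, v = np.ones_like(a), np.ones_like(b)
    exponent = tau / (tau + eps)            # softened scaling update
    for _ in range(n_iter):
        u = (a / (K @ v)) ** exponent
        v = (b / (K.T @ u)) ** exponent
    return u[:, None] * K * v[None, :]      # transport plan

# Toy setting (assumed): 3 known-class prototypes, 3 "known" samples
# near them, and 2 far-away "novel" samples.
rng = np.random.default_rng(0)
protos = rng.normal(size=(3, 2))
samples = np.vstack([
    protos + 0.05 * rng.normal(size=(3, 2)),   # near known prototypes
    rng.normal(4.0, 0.1, size=(2, 2)),         # far from all prototypes
])
cost = ((samples[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
a = np.full(5, 1 / 5)
b = np.full(3, 1 / 3)

P = unbalanced_sinkhorn(cost, a, b)
mass = P.sum(axis=1)                 # transported mass per sample
known_mask = mass > 0.5 * mass.max() # crude binarization into known/unknown
```

Because the marginal constraint on the sample side is only softly enforced, outliers are simply left untransported rather than being forced onto the nearest prototype; thresholding the per-sample mass then yields the known/unknown split.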