Online Task-Free Continual Generative and Discriminative Learning via Dynamic Cluster Memory

Fei Ye · Adrian Bors

Arch 4A-E Poster #200
Fri 21 Jun 5 p.m. PDT — 6:30 p.m. PDT


Online Task-Free Continual Learning (OTFCL) aims to learn novel concepts from streaming data without access to task information. Memory-based approaches have shown remarkable results in OTFCL, but most require supervised signals to drive their sample selection mechanisms, limiting their applicability to unsupervised learning. In this study, we address this issue by proposing a novel memory management approach, Dynamic Cluster Memory (DCM), which adaptively builds new memory clusters to capture distribution shifts over time without accessing supervised signals. Specifically, DCM introduces a novel memory expansion mechanism based on a knowledge discrepancy criterion, which evaluates the novelty of incoming data as the trigger for memory expansion, ensuring a compact memory capacity. Additionally, we propose a new sample selection approach that automatically stores incoming data samples with similar semantic information in the same memory cluster, promoting knowledge diversity among memory clusters. Furthermore, a novel memory pruning approach automatically removes memory clusters with overlapping information through a graph relation evaluation, ensuring a fixed memory capacity while maintaining diversity among the stored samples. The proposed DCM is model-free and plug-and-play, and applies to both supervised and unsupervised learning without any modification. Empirical results on OTFCL experiments show that the proposed DCM outperforms the state of the art while storing fewer data samples.
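The three mechanisms described above (novelty-triggered expansion, similarity-based sample routing, and overlap-based pruning) can be sketched as a simple cluster-memory buffer. This is an illustrative approximation, not the paper's algorithm: Euclidean distance to cluster centroids stands in for the knowledge discrepancy measure, and merging the pair of clusters with the closest centroids stands in for the graph relation evaluation; the threshold and capacity values are hypothetical.

```python
import numpy as np

class DynamicClusterMemory:
    """Illustrative cluster-based memory buffer for streaming data.

    A new sample either joins its semantically closest cluster or, if it is
    sufficiently novel with respect to every stored cluster, spawns a new
    one. When the cluster count exceeds a fixed capacity, the two clusters
    with the most overlapping information (closest centroids) are merged.
    """

    def __init__(self, expansion_threshold: float, max_clusters: int):
        self.threshold = expansion_threshold  # novelty level triggering expansion
        self.max_clusters = max_clusters      # fixed capacity, in clusters
        self.clusters = []                    # each cluster: list of sample vectors

    def _centroid(self, cluster):
        return np.mean(cluster, axis=0)

    def add(self, x):
        x = np.asarray(x, dtype=float)
        if not self.clusters:
            self.clusters.append([x])
            return
        # Sample selection: route x toward the semantically closest cluster.
        dists = [np.linalg.norm(x - self._centroid(c)) for c in self.clusters]
        nearest = int(np.argmin(dists))
        if dists[nearest] > self.threshold:
            # Expansion: x is novel with respect to all stored clusters.
            self.clusters.append([x])
            self._prune()
        else:
            self.clusters[nearest].append(x)

    def _prune(self):
        # Pruning: merge the most-overlapping cluster pair until the
        # fixed capacity is respected.
        while len(self.clusters) > self.max_clusters:
            cents = [self._centroid(c) for c in self.clusters]
            best = None
            for i in range(len(cents)):
                for j in range(i + 1, len(cents)):
                    d = np.linalg.norm(cents[i] - cents[j])
                    if best is None or d < best[0]:
                        best = (d, i, j)
            _, i, j = best
            self.clusters[i].extend(self.clusters.pop(j))
```

For example, streaming three samples where two lie near the origin and one lies far away yields two clusters, without any label information:

```python
mem = DynamicClusterMemory(expansion_threshold=1.0, max_clusters=4)
for x in [[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]]:
    mem.add(x)
# mem.clusters now holds two clusters: {(0,0), (0.1,0)} and {(5,5)}
```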
