Poster
Online Task-Free Continual Learning via Dynamic Expansionable Memory Distribution
Fei Ye · Adrian Bors
Recent continual learning (CL) research primarily addresses catastrophic forgetting in a simplified setting where class and task information are predefined. However, in more realistic and challenging CL scenarios, such supervised information is typically absent. In this paper, we address this challenging scenario by introducing a novel memory management approach: a dynamically expandable memory system that stores selected representatives of the evolving data while retaining essential long-term knowledge. Specifically, the memory system manages a series of memory distributions, each designed to represent the information of a distinct data category. We propose a new memory expansion mechanism that assesses the proximity between incoming samples and the existing memory distributions, using this evaluation to incrementally add new memory distributions to the system. In addition, we propose a novel memory distribution augmentation technique that selectively gathers suitable samples for each memory distribution, enhancing its statistical robustness over time. To prevent memory saturation before the training phase, we introduce a memory distribution reduction strategy that automatically eliminates overlapping memory distributions, ensuring adequate capacity for new information in subsequent learning episodes. A series of experiments demonstrates that our approach attains state-of-the-art performance in both supervised and unsupervised learning settings.
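To make the expansion, augmentation, and reduction steps concrete, below is a minimal NumPy sketch of how such a memory system might operate. It is not the paper's exact formulation: the Euclidean proximity measure, the Welford-style running statistics, and the thresholds `tau_expand` and `tau_merge` are all illustrative assumptions.

```python
# Illustrative sketch of a dynamically expandable memory of per-category
# distributions. Thresholds and the distance measure are assumptions,
# not values or choices taken from the paper.
import numpy as np

class MemoryDistribution:
    """One memory distribution: running diagonal-Gaussian statistics plus
    a small buffer of selected representative samples."""
    def __init__(self, x, max_samples=50):
        self.n = 1
        self.mean = x.copy()
        self.m2 = np.zeros_like(x)      # running sum of squared deviations
        self.buffer = [x.copy()]
        self.max_samples = max_samples

    def add(self, x):
        # Welford's online update: the augmentation step gathers suitable
        # samples so the statistics grow more robust over time.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        if len(self.buffer) < self.max_samples:
            self.buffer.append(x.copy())

class ExpandableMemory:
    def __init__(self, tau_expand=3.0, tau_merge=0.5):
        self.dists = []
        self.tau_expand = tau_expand    # expansion threshold (assumed)
        self.tau_merge = tau_merge      # reduction threshold (assumed)

    def observe(self, x):
        """Expansion check: create a new memory distribution when the
        incoming sample is far from every existing one; otherwise
        augment the nearest distribution."""
        if not self.dists:
            self.dists.append(MemoryDistribution(x))
            return
        d = [np.linalg.norm(x - m.mean) for m in self.dists]
        i = int(np.argmin(d))
        if d[i] > self.tau_expand:
            self.dists.append(MemoryDistribution(x))
        else:
            self.dists[i].add(x)

    def reduce(self):
        """Reduction step: merge distributions whose means nearly
        coincide, freeing capacity before the next training phase."""
        kept = []
        for m in self.dists:
            match = next((k for k in kept
                          if np.linalg.norm(k.mean - m.mean) < self.tau_merge),
                         None)
            if match is None:
                kept.append(m)
            else:
                for x in m.buffer:      # fold samples into the survivor
                    match.add(x)
        self.dists = kept
```

A quick usage example on a synthetic unlabeled stream, where each cluster plays the role of one data category:

```python
rng = np.random.default_rng(0)
mem = ExpandableMemory()
for c in range(3):                      # three unlabeled "categories"
    for x in rng.normal(loc=5.0 * c, scale=0.5, size=(100, 8)):
        mem.observe(x)
mem.reduce()
print(len(mem.dists))                   # ideally one distribution per category
```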