Elastic Weight Consolidation Done Right for Continual Learning
Abstract
Weight regularization methods in continual learning (CL) alleviate catastrophic forgetting by estimating and penalizing changes to important model weights. Elastic Weight Consolidation (EWC) is a foundational and widely used approach within this framework that estimates weight importance from gradients. However, it has consistently shown suboptimal performance. In this paper, we conduct a systematic analysis of importance estimation in EWC from a gradient-based perspective. For the first time, we find that EWC's reliance on the Fisher Information Matrix (FIM) leads to gradient vanishing and inaccurate importance estimates in certain scenarios. Our analysis also reveals that Memory Aware Synapses (MAS), a variant of EWC, imposes unnecessary constraints on parameters irrelevant to prior tasks, a phenomenon we term redundant protection. Consequently, both EWC and its variants misestimate weight importance in fundamental ways, leading to inferior performance. To address these issues, we propose the Logits Reversal (LR) operation, a simple yet effective modification that rectifies EWC's importance estimation: reversing the logit values during the calculation of the FIM prevents both gradient vanishing and redundant protection. Extensive experiments across diverse CL tasks and datasets show that the proposed method significantly outperforms EWC and its variants; we therefore refer to it as EWC Done Right (EWC-DR).
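To make the abstract's mechanism concrete, below is a minimal PyTorch sketch of the standard diagonal FIM estimation used by EWC, with an optional logits-reversal hook. The `reverse_logits` flag and its sign-flip interpretation of the LR operation are illustrative assumptions, since the abstract does not fully specify LR; the function name and signature are hypothetical.

```python
import torch
import torch.nn.functional as F

def diagonal_fisher(model, data_loader, device, reverse_logits=False):
    """Estimate the diagonal of the Fisher Information Matrix for EWC.

    If reverse_logits is True, negate the logits before computing the
    log-likelihood (an assumed reading of the Logits Reversal operation).
    """
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()
              if p.requires_grad}
    model.eval()
    n_samples = 0
    for x, _ in data_loader:
        x = x.to(device)
        model.zero_grad()
        logits = model(x)
        if reverse_logits:
            logits = -logits  # assumed form of the LR operation
        # Standard FIM estimation: sample labels from the model's own
        # predictive distribution, then differentiate the log-likelihood.
        log_probs = F.log_softmax(logits, dim=-1)
        labels = torch.multinomial(log_probs.exp(), num_samples=1).squeeze(-1)
        loss = F.nll_loss(log_probs, labels)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                # Squared batch-mean gradients are a common coarse
                # approximation; per-example gradients give the exact
                # empirical FIM diagonal.
                fisher[n] += p.grad.detach() ** 2 * x.size(0)
        n_samples += x.size(0)
    return {n: f / max(n_samples, 1) for n, f in fisher.items()}
```

In this reading, the resulting per-parameter importance $F_i$ enters the usual EWC penalty $\sum_i F_i (\theta_i - \theta_i^*)^2$, where $\theta^*$ denotes the weights after the previous task; only the FIM estimation step changes under LR.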