Incremental Object Detection via Future-Aware Decoupled Cross-Head Distillation
Abstract
Incremental Object Detection (IOD) enables AI systems to continuously acquire new object classes while preserving knowledge of previously learned ones, an ability essential for deployment in dynamic, real-world environments. Existing IOD methods typically rely on knowledge distillation to mitigate catastrophic forgetting. However, the tight coupling between the student model’s detection head and backbone causes distillation gradients to conflict with new-class supervision at the head, injecting head-specific bias into the backbone and ultimately weakening distillation. To address this issue, we propose a mechanism that decouples the training of the model’s backbone and detection head. Specifically, we introduce Future-aware Cross-head Distillation (FaCHD), which uses two frozen, complementary teachers (a historical teacher and an intermediate teacher) to decode the student’s RoI features for cross-head distillation. This strategy implicitly alleviates the prediction conflicts caused by detection-head bias and provides richer task-relevant guidance, thereby improving distillation efficiency. To further mitigate detection-head bias and the model’s recency problem, we propose a Prototype Semantic Drift Compensation module, which recalibrates multi-granularity prototypes of old classes, effectively correcting semantic drift and enhancing the stability of the detection head. Extensive experiments on two standard IOD benchmarks demonstrate the effectiveness and superiority of the proposed method. Code is available in the supplementary materials.
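To make the cross-head mechanism concrete, below is a minimal PyTorch-style sketch of how two frozen teacher heads might decode the student's RoI features, read directly from the abstract. All names (`CrossHeadDistiller`, `hist_head`, `inter_head`), the KL objective, the temperature `tau`, and the loss weights are illustrative assumptions, not the paper's actual FaCHD implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def freeze(module: nn.Module) -> nn.Module:
    """Freeze a teacher component: no parameter updates, eval-mode statistics."""
    for p in module.parameters():
        p.requires_grad = False
    return module.eval()


class CrossHeadDistiller(nn.Module):
    """Decode the *student's* RoI features with two frozen teacher heads
    (historical and intermediate) and match the resulting class distributions
    to each teacher's own predictions on the same regions."""

    def __init__(self, hist_head: nn.Module, inter_head: nn.Module, tau: float = 2.0):
        super().__init__()
        self.hist_head = freeze(hist_head)    # teacher from before the current task
        self.inter_head = freeze(inter_head)  # intermediate-checkpoint teacher
        self.tau = tau                        # softmax temperature

    def _kd(self, head: nn.Module, student_roi: torch.Tensor,
            teacher_roi: torch.Tensor) -> torch.Tensor:
        # Cross-head decoding: the frozen teacher head consumes student RoI
        # features, so the distillation gradient reaches the student backbone
        # without passing through the student's own, new-class-biased head.
        s_logits = head(student_roi)
        with torch.no_grad():                 # teacher targets carry no gradient
            t_logits = head(teacher_roi)
        return F.kl_div(
            F.log_softmax(s_logits / self.tau, dim=-1),
            F.softmax(t_logits / self.tau, dim=-1),
            reduction="batchmean",
        ) * self.tau ** 2

    def forward(self, student_roi: torch.Tensor, hist_roi: torch.Tensor,
                inter_roi: torch.Tensor, w_hist: float = 1.0,
                w_inter: float = 1.0) -> torch.Tensor:
        # Sum the guidance from both complementary teachers.
        return (w_hist * self._kd(self.hist_head, student_roi, hist_roi)
                + w_inter * self._kd(self.inter_head, student_roi, inter_roi))
```

A toy usage, assuming simple linear heads over pooled 256-d RoI feature vectors:

```python
hist_head = nn.Linear(256, 21)            # e.g., 20 old classes + background
inter_head = nn.Linear(256, 21)
distiller = CrossHeadDistiller(hist_head, inter_head)

student_roi = torch.randn(32, 256, requires_grad=True)  # from the student backbone
hist_roi = torch.randn(32, 256)           # same proposals, historical-teacher features
inter_roi = torch.randn(32, 256)          # same proposals, intermediate-teacher features

loss = distiller(student_roi, hist_roi, inter_roi)
loss.backward()                           # gradients reach only the student features
assert all(p.grad is None for p in distiller.parameters())
```

Because both heads are frozen and consume the student's features, the loss in this sketch back-propagates only into the student backbone, which is the decoupling the abstract describes.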
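The Prototype Semantic Drift Compensation module can likewise be sketched. The recipe below follows the generic semantic-drift-compensation idea of shifting stored old-class prototypes by the feature drift observed on current-task data; the Gaussian weighting, `sigma`, and the prototype layout are assumptions, and how the paper defines its multiple granularities is specified in the paper body, not here.

```python
import torch


@torch.no_grad()
def compensate_prototypes(prototypes: torch.Tensor, feats_old: torch.Tensor,
                          feats_new: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Shift each stored old-class prototype by the feature drift observed on
    current-task samples, weighted by how close each sample's old-model
    feature lies to that prototype.

    prototypes: (C, D) old-class prototypes in the old feature space
    feats_old:  (N, D) current-task features from the previous model
    feats_new:  (N, D) the same samples' features from the updated model
    """
    drift = feats_new - feats_old                        # per-sample drift (N, D)
    # Gaussian kernel: samples near a prototype vote more on its drift.
    d2 = torch.cdist(prototypes, feats_old).pow(2)       # squared distances (C, N)
    w = torch.exp(-d2 / (2 * sigma ** 2))                # (C, N)
    w = w / w.sum(dim=1, keepdim=True).clamp_min(1e-8)   # normalize per prototype
    return prototypes + w @ drift                        # recalibrated (C, D)
```

In this reading, the recalibration would be applied at each granularity of prototype the method maintains, and the corrected prototypes then serve to stabilize the detection head against semantic drift, per the abstract.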