Publication Type

Journal Article

Version

submittedVersion

Publication Date

8-2025

Abstract

Model-based class incremental learning (CIL) methods aim to address the challenge of catastrophic forgetting by retaining certain parameters and expanding the model architecture. However, retaining too many parameters can lead to an overly complex model, increasing inference overhead. Additionally, compressing these parameters to reduce the model size can result in performance degradation. To tackle these challenges, we propose a novel three-stage CIL framework called Localized and Layered Reparameterization for Incremental Learning (L3Net). The rationale behind our approach is to balance model complexity and performance by selectively expanding and optimizing critical components. Specifically, the framework introduces a Localized Dual-path Expansion structure, which allows the model to learn simultaneously from both old and new features by integrating a fusion selector after each convolutional layer. To further minimize potential conflicts between old and new features, we implement the Feature Selectors Gradient Resetting method, which sparsifies the fusion selectors and reduces the influence of redundant old features. Additionally, to address classification bias resulting from class imbalance, we design the Decoupled Balanced Distillation technique and apply Logit Adjustment to more effectively retain knowledge from the rehearsal set. Extensive experiments demonstrate that our L3Net framework outperforms state-of-the-art methods on widely used benchmarks, including CIFAR-100 and ImageNet-100/1000.
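
The abstract only outlines the architecture, so the following is a minimal, hypothetical sketch of the core idea of a dual-path convolution merged by a learnable fusion selector. All names (e.g., `DualPathConvBlock`, `selector`) and design details here are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class DualPathConvBlock(nn.Module):
    """Illustrative dual-path block: a frozen 'old' conv branch plus a
    trainable 'new' branch, combined by a learnable per-channel fusion
    selector. A sketch under stated assumptions, not the paper's code."""

    def __init__(self, old_conv: nn.Conv2d):
        super().__init__()
        # Old path: the previous task's conv layer, kept frozen.
        self.old_conv = old_conv
        for p in self.old_conv.parameters():
            p.requires_grad = False
        # New path: a freshly initialized conv of the same shape, trained on the new task.
        self.new_conv = nn.Conv2d(
            old_conv.in_channels, old_conv.out_channels,
            kernel_size=old_conv.kernel_size, stride=old_conv.stride,
            padding=old_conv.padding, bias=old_conv.bias is not None,
        )
        # Fusion selector: one gate per output channel, deciding how much
        # of the old feature to keep versus the new one.
        self.selector = nn.Parameter(torch.zeros(1, old_conv.out_channels, 1, 1))

    def forward(self, x):
        gate = torch.sigmoid(self.selector)  # per-channel weight in (0, 1)
        return gate * self.old_conv(x) + (1 - gate) * self.new_conv(x)
```

Under this reading, sparsifying the selector (pushing gates for redundant old channels toward zero) would correspond loosely to the abstract's description of reducing the influence of redundant old features; the exact mechanism is specified in the paper itself.
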

Keywords

Class incremental learning, Knowledge distillation, Reparameterization

Discipline

Artificial Intelligence and Robotics | Theory and Algorithms

Areas of Excellence

Digital transformation

Publication

Neural Networks

Volume

188

First Page

1

Last Page

14

ISSN

0893-6080

Identifier

10.1016/j.neunet.2025.107420

Publisher

Elsevier

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1016/j.neunet.2025.107420
