Publication Type

Journal Article

Version

acceptedVersion

Publication Date

1-2024

Abstract

According to the Complementary Learning Systems (CLS) theory (McClelland et al. 1995) in neuroscience, humans achieve effective continual learning through two complementary systems: a fast learning system centered on the hippocampus for rapid learning of specific, individual experiences, and a slow learning system located in the neocortex for the gradual acquisition of structured knowledge about the environment. Motivated by this theory, we propose DualNets (for Dual Networks), a general continual learning framework comprising a fast learning system for supervised learning of pattern-separated representations from specific tasks and a slow learning system for learning task-agnostic, general representations via Self-Supervised Learning (SSL). DualNets seamlessly incorporates both representation types into a holistic framework to facilitate better continual learning in deep neural networks. Through extensive experiments, we demonstrate the promising results of DualNets on a wide range of continual learning protocols, from the standard offline, task-aware setting to the challenging online, task-free scenario. Notably, on the CTrL (Veniat et al. 2020) benchmark, which contains unrelated tasks with vastly different visual images, DualNets achieves performance competitive with existing state-of-the-art dynamic-architecture strategies (Ostapenko et al. 2021). Furthermore, we conduct comprehensive ablation studies to validate DualNets' efficacy, robustness, and scalability.
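
As a concrete illustration of the fast/slow split described in the abstract, below is a minimal PyTorch sketch pairing a slow learner trained with a SimSiam-style self-supervised loss against a fast supervised head that classifies on top of the slow features. The module names (SlowNet, FastHead), the layer sizes, the noise-based "views", and the specific SSL objective are illustrative assumptions for exposition, not the paper's exact DualNets design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SlowNet(nn.Module):
    """Slow learner: task-agnostic, general representations via SSL."""
    def __init__(self, dim=784, feat=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim, 256), nn.ReLU(),
                                     nn.Linear(256, feat))
        self.predictor = nn.Sequential(nn.Linear(feat, 64), nn.ReLU(),
                                       nn.Linear(64, feat))

    def ssl_loss(self, x1, x2):
        # Negative cosine similarity between two augmented views, with a
        # stop-gradient on the target branch (SimSiam-style objective).
        z1, z2 = self.encoder(x1), self.encoder(x2)
        p1, p2 = self.predictor(z1), self.predictor(z2)
        return -(F.cosine_similarity(p1, z2.detach(), dim=-1).mean()
                 + F.cosine_similarity(p2, z1.detach(), dim=-1).mean()) / 2

class FastHead(nn.Module):
    """Fast learner: supervised, task-specific mapping on slow features."""
    def __init__(self, feat=128, n_classes=10):
        super().__init__()
        self.fc = nn.Linear(feat, n_classes)

    def forward(self, features):
        return self.fc(features)

slow, fast = SlowNet(), FastHead()
opt = torch.optim.SGD(list(slow.parameters()) + list(fast.parameters()),
                      lr=0.01)

# One interleaved update: the slow system learns from unlabeled views
# while the fast system learns the current task from labels.
x = torch.randn(32, 784)                        # a batch of inputs
x1 = x + 0.1 * torch.randn_like(x)              # "view" 1 (toy augmentation)
x2 = x + 0.1 * torch.randn_like(x)              # "view" 2
y = torch.randint(0, 10, (32,))                 # current-task labels

loss = slow.ssl_loss(x1, x2) + F.cross_entropy(fast(slow.encoder(x)), y)
opt.zero_grad(); loss.backward(); opt.step()
```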

Keywords

Continual learning, fast and slow learning

Discipline

Artificial Intelligence and Robotics | Theory and Algorithms

Research Areas

Data Science and Engineering

Publication

IEEE Transactions on Pattern Analysis and Machine Intelligence

Volume

46

Issue

1

First Page

134

Last Page

149

ISSN

0162-8828

Identifier

10.1109/TPAMI.2023.3324203

Publisher

Institute of Electrical and Electronics Engineers

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1109/TPAMI.2023.3324203
