Sparse Progressive Neural Networks for Continual Learning

Esra Ergün*, Behçet Uğur Töreyin

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

The human brain effectively integrates prior knowledge into new skills, transferring experience across tasks without suffering from catastrophic forgetting. In this study, to continuously learn a sequence of visual classification tasks, we employed a neural network model with lateral connections called Progressive Neural Networks (PNNs). We sparsified PNNs with the sparse group Least Absolute Shrinkage and Selection Operator (LASSO) and trained conventional PNNs with recursive connections. We then investigated the effect of the task prior on current performance under various task orders. The proposed approach is evaluated on permuted MNIST and on selected subtasks from the CIFAR-100 dataset. Results show that sparse group LASSO regularization effectively sparsifies progressive neural networks and that the task sequence order affects performance.
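As a rough illustration of the regularization the abstract describes, the snippet below sketches a sparse group LASSO penalty for a single weight matrix. The function name, the choice of rows as groups, and the `alpha`/`lam` mixing parameters are assumptions for illustration, not the paper's exact formulation or hyperparameters.

```python
import numpy as np

def sparse_group_lasso_penalty(W, alpha=0.5, lam=0.01):
    """Illustrative sparse group LASSO penalty for a weight matrix W.

    Treats each row of W as a group (assumption: one group per unit's
    incoming weights). Mixes an element-wise L1 term, which zeroes
    individual weights, with a group-wise L2 term, which can zero out
    entire groups at once.
    """
    l1 = np.abs(W).sum()                         # element-wise sparsity
    group_l2 = np.linalg.norm(W, axis=1).sum()   # group-level sparsity
    return lam * (alpha * l1 + (1.0 - alpha) * group_l2)

# Example: one group is all-zero, so only the first row contributes
# to the group term.
W = np.array([[3.0, 4.0],
              [0.0, 0.0]])
penalty = sparse_group_lasso_penalty(W, alpha=0.5, lam=1.0)  # 0.5*7 + 0.5*5 = 6.0
```

Adding such a penalty to the training loss encourages whole units (and hence whole lateral connections in a PNN column) to be pruned, while the L1 term sparsifies the surviving weights.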

Original language: English
Title of host publication: Advances in Computational Collective Intelligence - 13th International Conference, ICCCI 2021, Proceedings
Editors: Krystian Wojtkiewicz, Jan Treur, Elias Pimenidis, Marcin Maleszka
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 715-725
Number of pages: 11
ISBN (Print): 9783030881122
DOIs
Publication status: Published - 2021
Event: 13th International Conference on Computational Collective Intelligence, ICCCI 2021 - Virtual, Online
Duration: 29 Sept 2021 – 1 Oct 2021

Publication series

Name: Communications in Computer and Information Science
Volume: 1463
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 13th International Conference on Computational Collective Intelligence, ICCCI 2021
City: Virtual, Online
Period: 29/09/21 – 1/10/21

Bibliographical note

Publisher Copyright:
© 2021, Springer Nature Switzerland AG.

Keywords

  • Continual learning
  • Progressive Neural Networks
  • Sparse group LASSO regularization
