Abstract
The human brain effectively integrates prior knowledge into new skills, transferring experience across tasks without suffering from catastrophic forgetting. In this study, we employed a neural network model with lateral connections, called Progressive Neural Networks (PNNs), to continuously learn a sequence of visual classification tasks. We sparsified PNNs with the sparse group Least Absolute Shrinkage and Selection Operator (LASSO) and trained conventional PNNs with recursive connections. We then investigated the effect of the task prior on current performance under various task orders. The proposed approach is evaluated on permuted MNIST and on selected subtasks from the CIFAR-100 dataset. Results show that sparse group LASSO regularization effectively sparsifies progressive neural networks and that the task sequence order affects performance.
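The sparse group LASSO penalty mentioned in the abstract combines an element-wise L1 term with a group-wise L2 term, so that entire groups of weights (e.g. all connections of one unit in a PNN column) can be driven to zero together. A minimal sketch of the penalty in NumPy is below; the function name, the grouping of weights, and the values of the regularization strength `lam` and mixing coefficient `alpha` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sparse_group_lasso_penalty(w, groups, lam=0.01, alpha=0.5):
    """Sparse group LASSO penalty on a flat weight vector `w`.

    `groups` is a list of index arrays partitioning `w`.
    The penalty mixes an L1 term (element-wise sparsity) with a
    group-wise L2 term (whole groups zeroed out together), each
    group weighted by the square root of its size.
    """
    l1 = np.abs(w).sum()
    group_l2 = sum(
        np.sqrt(len(idx)) * np.linalg.norm(w[idx]) for idx in groups
    )
    return lam * (alpha * l1 + (1.0 - alpha) * group_l2)

# Illustrative example: a 2x3 layer weight matrix, one group per row
# (per incoming unit), flattened into a vector.
W = np.array([[1.0, -2.0, 0.5],
              [0.0,  0.0, 0.0]])
groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]
penalty = sparse_group_lasso_penalty(W.ravel(), groups, lam=1.0, alpha=0.5)
```

Note that the second row contributes nothing to either term, which is exactly the behavior the group term encourages: pruning whole units rather than scattered individual weights.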
Original language | English |
---|---|
Title of host publication | Advances in Computational Collective Intelligence - 13th International Conference, ICCCI 2021, Proceedings |
Editors | Krystian Wojtkiewicz, Jan Treur, Elias Pimenidis, Marcin Maleszka |
Publisher | Springer Science and Business Media Deutschland GmbH |
Pages | 715-725 |
Number of pages | 11 |
ISBN (Print) | 9783030881122 |
DOIs | |
Publication status | Published - 2021 |
Event | 13th International Conference on Computational Collective Intelligence, ICCCI 2021 - Virtual, Online. Duration: 29 Sept 2021 → 1 Oct 2021 |
Publication series
Name | Communications in Computer and Information Science |
---|---|
Volume | 1463 |
ISSN (Print) | 1865-0929 |
ISSN (Electronic) | 1865-0937 |
Conference
Conference | 13th International Conference on Computational Collective Intelligence, ICCCI 2021 |
---|---|
City | Virtual, Online |
Period | 29/09/21 → 1/10/21 |
Bibliographical note
Publisher Copyright: © 2021, Springer Nature Switzerland AG.
Keywords
- Continual learning
- Progressive Neural Networks
- Sparse group LASSO regularization