Abstract
The human brain effectively integrates prior knowledge into new skills by transferring experience across tasks without suffering from catastrophic forgetting. In this study, to continually learn a sequence of visual classification tasks (PermutedMNIST), we employ a neural network model with lateral connections, sparse group Least Absolute Shrinkage and Selection Operator (LASSO) regularization, and projection regularization to decrease feature redundancy. We show that encouraging feature novelty in progressive neural networks (PNNs) prevents a major performance drop under sparsification, and that sparsifying a progressive neural network yields fair results while decreasing the number of task-specific parameters learned on novel tasks.
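The abstract names three ingredients without giving implementation details, so the following is only a minimal PyTorch sketch of how they can fit together: a PNN column with lateral connections from frozen earlier columns, a sparse group LASSO penalty on the lateral weights, and a projection penalty that discourages redundant features. All names (`PNNColumn`, `sparse_group_lasso`, `projection_penalty`), layer sizes, and penalty coefficients are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn


class PNNColumn(nn.Module):
    """One column of a progressive neural network (PNN) -- illustrative.

    A new column is added per task; it receives lateral connections
    from the frozen layer-1 activations of all earlier columns.
    """

    def __init__(self, in_dim=784, hidden_dim=256, out_dim=10, n_prev=0):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, out_dim)
        # One lateral adapter per previously trained column.
        self.laterals = nn.ModuleList(
            [nn.Linear(hidden_dim, out_dim, bias=False) for _ in range(n_prev)]
        )

    def forward(self, x, prev_h1=()):
        h1 = torch.relu(self.fc1(x))
        out = self.fc2(h1)
        for lateral, h_old in zip(self.laterals, prev_h1):
            out = out + lateral(h_old.detach())  # earlier columns stay frozen
        return out, h1


def sparse_group_lasso(laterals, lam_group=1e-3, lam_l1=1e-4):
    """Sparse group LASSO on lateral weights: the L2 group term can zero
    out an entire incoming feature (pruning that lateral connection),
    while the L1 term sparsifies individual weights within groups."""
    penalty = 0.0
    for lateral in laterals:
        w = lateral.weight  # shape: (out_dim, hidden_dim)
        penalty = penalty + lam_group * w.norm(dim=0).sum()  # group per feature
        penalty = penalty + lam_l1 * w.abs().sum()           # within-group L1
    return penalty


def projection_penalty(h_new, prev_h1, lam_proj=1e-3):
    """Projection regularizer: penalize correlation between the new
    column's features and each frozen column's features, encouraging
    the new column to learn novel, non-redundant features."""
    penalty = 0.0
    for h_old in prev_h1:
        cross = h_old.detach().t() @ h_new  # batch cross-correlation
        penalty = penalty + lam_proj * (cross ** 2).sum()
    return penalty


# Hypothetical training step for a third task's column:
# logits, h1 = column(x, prev_h1=[h1_col0, h1_col1])
# loss = criterion(logits, y) \
#        + sparse_group_lasso(column.laterals) \
#        + projection_penalty(h1, [h1_col0, h1_col1])
```

Under these assumptions, the group term lets whole lateral connections be pruned after training, which is one plausible reading of how sparsification reduces the task-specific parameter count reported in the abstract.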
| Translated title of the contribution | Continual Learning with Sparse Progressive Neural Networks |
| --- | --- |
| Original language | Turkish |
| Title of host publication | 2020 28th Signal Processing and Communications Applications Conference, SIU 2020 - Proceedings |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| ISBN (Electronic) | 9781728172064 |
| DOIs | |
| Publication status | Published - 5 Oct 2020 |
| Event | 28th Signal Processing and Communications Applications Conference, SIU 2020, Gaziantep, Turkey. Duration: 5 Oct 2020 → 7 Oct 2020 |
Publication series
| Name | 2020 28th Signal Processing and Communications Applications Conference, SIU 2020 - Proceedings |
| --- | --- |
Conference
| Conference | 28th Signal Processing and Communications Applications Conference, SIU 2020 |
| --- | --- |
| Country/Territory | Turkey |
| City | Gaziantep |
| Period | 5/10/20 → 7/10/20 |
Bibliographical note
Publisher Copyright: © 2020 IEEE.