Seyrek İlerlemeli Sinir Ağları ile Sürekli Öğrenme

Translated title of the contribution: Continual Learning with Sparse Progressive Neural Networks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The human brain effectively integrates prior knowledge into new skills by transferring experience across tasks without suffering from catastrophic forgetting. In this study, to continually learn a visual classification task sequence (PermutedMNIST), we employ a neural network model with lateral connections, sparse group Least Absolute Shrinkage and Selection Operator (LASSO) regularization, and projection regularization to reduce feature redundancy. We show that encouraging feature novelty in progressive neural networks (PNNs) prevents a major performance drop under sparsification, and that sparsifying a progressive neural network yields fair results while reducing the number of learned task-specific parameters on novel tasks.
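The abstract names sparse group LASSO as the regularizer that prunes task-specific parameters. As a minimal illustrative sketch (not the authors' implementation; the function name, group layout, and coefficients below are assumptions), the penalty combines an element-wise L1 term with a group-wise L2 term that can drive whole groups of weights, e.g. all incoming weights of a unit, to zero:

```python
import numpy as np

def sparse_group_lasso_penalty(weights, groups, lam1=0.01, lam2=0.01):
    """Sparse group LASSO penalty: lam1 * ||w||_1 (element-wise sparsity)
    plus lam2 * sum_g sqrt(|g|) * ||w_g||_2 (group-wise sparsity)."""
    l1 = lam1 * np.abs(weights).sum()
    group_l2 = sum(
        np.sqrt(len(idx)) * np.linalg.norm(weights[idx])
        for idx in groups
    )
    return l1 + lam2 * group_l2

# Toy example: six weights split into two groups of three.
# The second group is entirely zero, so it contributes nothing.
w = np.array([0.5, 0.0, -0.2, 0.0, 0.0, 0.0])
groups = [np.arange(0, 3), np.arange(3, 6)]
penalty = sparse_group_lasso_penalty(w, groups)
```

In training, this penalty would be added to the task loss; the group term is what makes entire units removable after convergence, which is how sparsification reduces the per-task parameter count the abstract refers to.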

Translated title of the contribution: Continual Learning with Sparse Progressive Neural Networks
Original language: Turkish
Title of host publication: 2020 28th Signal Processing and Communications Applications Conference, SIU 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728172064
DOIs
Publication status: Published - 5 Oct 2020
Event: 28th Signal Processing and Communications Applications Conference, SIU 2020 - Gaziantep, Turkey
Duration: 5 Oct 2020 - 7 Oct 2020

Publication series

Name: 2020 28th Signal Processing and Communications Applications Conference, SIU 2020 - Proceedings

Conference

Conference: 28th Signal Processing and Communications Applications Conference, SIU 2020
Country/Territory: Turkey
City: Gaziantep
Period: 5/10/20 - 7/10/20

Bibliographical note

Publisher Copyright:
© 2020 IEEE.
