Overcoming Class Imbalance in Incremental Learning Using an Elastic Weight Consolidation-Assisted Common Encoder Approach

Engin Baysal, Cüneyt Bayılmış*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Incremental learning empowers models to continuously acquire knowledge of new classes while retaining previously learned information. However, catastrophic forgetting and class imbalance often impede this process, especially when new classes are introduced sequentially. We propose a hybrid method that integrates Elastic Weight Consolidation (EWC) with a shared encoder architecture to overcome these obstacles. The shared encoder provides robust feature extraction, while EWC safeguards vital parameters and preserves prior knowledge. Moreover, task-specific output layers enable flexible adaptation to new classes. We evaluated our method on the CICIoT2023 dataset, a class-incremental IoT anomaly detection benchmark. Our results demonstrated a 15.3% improvement in the macro F1-score and a 1.28% increase in overall accuracy compared to a baseline model without EWC, with particular advantages for underrepresented classes. These findings underscore the effectiveness of the EWC-assisted shared encoder framework for class-imbalanced incremental learning in streaming environments.
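The page does not include the authors' implementation; the following is a minimal PyTorch sketch of the general idea described in the abstract: a shared (common) encoder, task-specific output heads added per incremental task, and the standard EWC quadratic penalty (Kirkpatrick et al., 2017). The names CommonEncoderModel, snapshot_params, and estimate_fisher, as well as all layer sizes, are illustrative assumptions, not details from the paper.

# Minimal sketch, not the authors' code: architecture sizes and helper names are assumed.
import torch
import torch.nn as nn


class CommonEncoderModel(nn.Module):
    """Shared (common) encoder with one output head added per incremental task."""

    def __init__(self, in_dim: int, hidden_dim: int = 128):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        self.heads = nn.ModuleList()  # task-specific output layers

    def add_head(self, num_classes: int) -> None:
        self.heads.append(nn.Linear(self.hidden_dim, num_classes))

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        return self.heads[task_id](self.encoder(x))


def snapshot_params(model: nn.Module) -> dict:
    """Copy of the parameters learned so far (theta* in the EWC penalty)."""
    return {n: p.detach().clone() for n, p in model.named_parameters()}


def estimate_fisher(model: nn.Module, loader, task_id: int) -> dict:
    """Diagonal Fisher information approximated by averaged squared gradients."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    criterion = nn.CrossEntropyLoss()
    model.eval()
    for x, y in loader:
        model.zero_grad()
        criterion(model(x, task_id), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / max(len(loader), 1) for n, f in fisher.items()}


def ewc_penalty(model: nn.Module, fisher: dict, old_params: dict):
    """EWC regularizer: sum_i F_i * (theta_i - theta*_i)^2 over consolidated parameters."""
    loss = 0.0
    for n, p in model.named_parameters():
        if n in fisher and n in old_params:  # newly added heads are not penalized
            loss = loss + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return loss


# Training on a new task would then minimize, for some regularization strength lambda_ewc:
#     total_loss = task_cross_entropy + (lambda_ewc / 2) * ewc_penalty(model, fisher, old_params)

In this sketch, only parameters recorded before the new task (primarily the shared encoder) are pulled toward their previous values, while each new head remains free to fit its own classes.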

Original language: English
Article number: 1887
Journal: Mathematics
Volume: 13
Issue number: 11
DOIs
Publication status: Published - Jun 2025

Bibliographical note

Publisher Copyright:
© 2025 by the authors.

Keywords

  • catastrophic forgetting
  • class imbalance
  • common encoder
  • continual learning
  • incremental learning
