Abstract
Incremental learning empowers models to continuously acquire knowledge of new classes while retaining previously learned information. However, catastrophic forgetting and class imbalance often impede this process, especially when new classes are introduced sequentially. We propose a hybrid method that integrates Elastic Weight Consolidation (EWC) with a shared encoder architecture to overcome these obstacles. This approach provides robust feature extraction, while EWC safeguards vital parameters and preserves prior knowledge. Moreover, task-specific output layers enable flexible adaptation to new classes. We evaluated our method using the CICIoT2023 dataset, a class-incremental IoT anomaly detection benchmark. Our results demonstrated a 15.3% improvement in the macro F1-score and a 1.28% increase in overall accuracy compared to a baseline model that did not incorporate EWC, with particular advantages for underrepresented classes. These findings underscore the effectiveness of the EWC-assisted shared encoder framework for class-imbalanced incremental learning in streaming environments.
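The abstract's core mechanism, Elastic Weight Consolidation, penalizes movement of parameters that were important for earlier tasks using a diagonal Fisher-information weighting. As a rough illustration (not the paper's implementation; the function name, dictionary layout, and `lam` coefficient are assumptions for this sketch), the EWC regularizer added to the current task's loss can be written as:

```python
import numpy as np

def ewc_penalty(params, anchor_params, fisher, lam=1.0):
    """Quadratic EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    anchor_params are the parameter values saved after learning previous
    classes; fisher holds diagonal Fisher-information estimates that weight
    how important each parameter was for the earlier tasks.
    This is an illustrative sketch, not the authors' code.
    """
    penalty = 0.0
    for name in params:
        diff = params[name] - anchor_params[name]
        penalty += np.sum(fisher[name] * diff ** 2)
    return 0.5 * lam * penalty

# The total training loss for a new task would then be
#   task_loss + ewc_penalty(params, anchor_params, fisher, lam)
params = {"w": np.array([1.0, 2.0])}
anchor = {"w": np.array([0.0, 2.0])}
fisher = {"w": np.array([4.0, 1.0])}
print(ewc_penalty(params, anchor, fisher, lam=2.0))  # 0.5 * 2 * (4*1 + 1*0) = 4.0
```

In the shared-encoder setting described above, this penalty would typically be applied to the encoder's parameters, while the task-specific output heads remain free to adapt to newly introduced classes.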
| Original language | English |
|---|---|
| Article number | 1887 |
| Journal | Mathematics |
| Volume | 13 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - Jun 2025 |
Bibliographical note
Publisher Copyright: © 2025 by the authors.
Keywords
- catastrophic forgetting
- class imbalance
- common encoder
- continual learning
- incremental learning