Abstract
As an alternative to resource-intensive deep learning approaches to the continual learning problem, we propose a simple, fast algorithm inspired by adaptive resonance theory (ART). To cope with the curse of dimensionality and avoid catastrophic forgetting, we apply incremental principal component analysis (IPCA) to the model’s previously learned weights. Experiments show that this approach approximates the performance achieved using static PCA and is competitive with continual deep learning methods. Our implementation is available at https://github.com/neil-ash/ART-IPCA.
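The authors' actual implementation is in the linked repository. As a rough illustration of the general idea the abstract describes (an ART-style prototype learner operating in a low-dimensional subspace maintained by incremental PCA), here is a minimal sketch built on scikit-learn's `IncrementalPCA`. The class name `ARTIPCASketch`, the vigilance threshold, and the prototype-update rule are illustrative assumptions, not the paper's method.

```python
# A minimal sketch (NOT the authors' implementation) of an ART-style
# nearest-prototype learner over a subspace updated with incremental PCA.
import numpy as np
from sklearn.decomposition import IncrementalPCA


class ARTIPCASketch:
    """Hypothetical ART-like continual learner on an IPCA projection."""

    def __init__(self, n_components=32, vigilance=0.75):
        self.ipca = IncrementalPCA(n_components=n_components)
        self.vigilance = vigilance   # similarity needed for "resonance"
        self.prototypes = []         # learned category weight vectors
        self.labels = []

    def partial_fit(self, X, y):
        # Update the subspace incrementally, then project the batch.
        # Note: each batch must contain at least n_components samples.
        self.ipca.partial_fit(X)
        Z = self.ipca.transform(X)
        for z, label in zip(Z, y):
            self._learn_one(z, label)

    def _learn_one(self, z, label):
        # Resonance test: adapt the best-matching prototype if it is
        # similar enough and has the same label; else commit a new one.
        if self.prototypes:
            sims = [self._cos(z, p) for p in self.prototypes]
            j = int(np.argmax(sims))
            if sims[j] >= self.vigilance and self.labels[j] == label:
                # Move the winning prototype slightly toward the sample.
                self.prototypes[j] = 0.9 * self.prototypes[j] + 0.1 * z
                return
        self.prototypes.append(z.copy())
        self.labels.append(label)

    def predict(self, X):
        # Label each sample with its nearest prototype (cosine similarity).
        Z = self.ipca.transform(X)
        return np.array([
            self.labels[int(np.argmax([self._cos(z, p) for p in self.prototypes]))]
            for z in Z
        ])

    @staticmethod
    def _cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

A stream of task batches would simply be fed through `partial_fit` in arrival order; since scikit-learn's `IncrementalPCA.partial_fit` requires each batch to contain at least `n_components` samples, smaller batches would need buffering before the subspace update.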
| Original language | English (US) |
|---|---|
| State | Published - 2023 |
| Event | 1st Tiny Papers at 11th International Conference on Learning Representations, Tiny Papers @ ICLR 2023 - Kigali, Rwanda |
| Duration | May 5 2023 → May 5 2023 |
Conference
| Conference | 1st Tiny Papers at 11th International Conference on Learning Representations, Tiny Papers @ ICLR 2023 |
|---|---|
| Country/Territory | Rwanda |
| City | Kigali |
| Period | 5/5/23 → 5/5/23 |
All Science Journal Classification (ASJC) codes
- Linguistics and Language
- Language and Linguistics
- Computer Science Applications
- Education