Abstract

As an alternative to resource-intensive deep learning approaches to the continual learning problem, we propose a simple, fast algorithm inspired by adaptive resonance theory (ART). To cope with the curse of dimensionality and avoid catastrophic forgetting, we apply incremental principal component analysis (IPCA) to the model's previously learned weights. Experiments show that this approach approximates the performance achieved using static PCA and is competitive with continual deep learning methods. Our implementation is available at https://github.com/neil-ash/ART-IPCA.
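The authors' implementation is in the linked repository; as a rough, hedged illustration of the IPCA component alone (not the authors' ART-based algorithm), scikit-learn's `IncrementalPCA` shows how a low-dimensional projection can be updated batch by batch, without revisiting earlier data — the property that makes IPCA attractive for a continual-learning stream. The task structure and dimensions below are invented for the sketch:

```python
# Sketch: incremental PCA over a stream of high-dimensional batches.
# Illustrates the generic IPCA technique via scikit-learn, NOT the
# paper's method; batch sizes and dimensions are made up.
import numpy as np
from sklearn.decomposition import IncrementalPCA

rng = np.random.default_rng(0)
ipca = IncrementalPCA(n_components=16)

# Simulate a continual-learning stream: batches arrive one task at a
# time, and the projection is updated without storing past batches.
for task in range(5):
    batch = rng.normal(size=(200, 784))  # e.g. flattened 28x28 images
    ipca.partial_fit(batch)              # update components incrementally

# Project a new high-dimensional sample into the learned subspace.
sample = rng.normal(size=(1, 784))
z = ipca.transform(sample)
print(z.shape)  # (1, 16)
```

Because `partial_fit` updates the principal components from each batch alone, memory stays bounded by the batch size rather than the full stream, which is the appeal of IPCA in this setting.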

Original language: English (US)
State: Published - 2023
Event: 1st Tiny Papers at 11th International Conference on Learning Representations, Tiny Papers @ ICLR 2023 - Kigali, Rwanda
Duration: May 5 2023 - May 5 2023

Conference

Conference: 1st Tiny Papers at 11th International Conference on Learning Representations, Tiny Papers @ ICLR 2023
Country/Territory: Rwanda
City: Kigali
Period: 5/5/23 - 5/5/23

All Science Journal Classification (ASJC) codes

  • Linguistics and Language
  • Language and Linguistics
  • Computer Science Applications
  • Education

Title: A Simple, Fast Algorithm for Continual Learning from High-Dimensional Data