Online Constrained Meta-Learning: Provable Guarantees for Generalization

Siyuan Xu, Minghui Zhu

Research output: Contribution to journal › Conference article › peer-review


Abstract

Meta-learning has attracted attention due to its strong ability to learn from experience on known tasks, which can speed up and enhance learning on new tasks. However, most existing meta-learning approaches can only learn from tasks without any constraints. This paper proposes an online constrained meta-learning framework, which continuously learns meta-knowledge from sequential learning tasks that are subject to hard constraints. Beyond existing meta-learning analyses, we provide upper bounds on the optimality gaps and constraint violations of the deployed task-specific models produced by the proposed framework. These metrics account for both the dynamic regret of online learning and the generalization ability of the task-specific models to unseen data. Moreover, we provide a practical algorithm for the framework and validate its effectiveness through experiments on meta-imitation learning and few-shot image classification.
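To make the setting concrete, the following is a minimal sketch of an online constrained meta-learning loop: tasks arrive sequentially, each task-specific model is adapted from a shared meta-initialization by projected gradient steps that enforce a hard constraint (here, a norm-ball constraint chosen purely for illustration), and the meta-initialization is then updated online. The Reptile-style first-order meta-update, the synthetic regression tasks, and all function names are assumptions for illustration, not the paper's actual algorithm or guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)

def project_ball(w, radius=1.0):
    """Project onto the hard constraint set {w : ||w||_2 <= radius}."""
    norm = np.linalg.norm(w)
    return w if norm <= radius else w * (radius / norm)

def task_loss_grad(w, X, y):
    """Gradient of the per-task squared loss 0.5*||Xw - y||^2 / n."""
    return X.T @ (X @ w - y) / len(y)

def adapt(theta, X, y, steps=5, lr=0.1, radius=1.0):
    """Inner loop: projected gradient descent starting from the meta-initialization."""
    w = theta.copy()
    for _ in range(steps):
        w = project_ball(w - lr * task_loss_grad(w, X, y), radius)
    return w

# Outer loop: tasks arrive online; the meta-initialization is updated after each
# task with a Reptile-style first-order step (an assumption, not the paper's rule).
d, theta, meta_lr = 5, np.zeros(5), 0.2
for t in range(50):
    w_true = project_ball(rng.normal(size=d))    # hypothetical ground-truth task parameter
    X = rng.normal(size=(20, d))
    y = X @ w_true + 0.05 * rng.normal(size=20)
    w_task = adapt(theta, X, y)                  # deployed task-specific (feasible) model
    theta = theta + meta_lr * (w_task - theta)   # online meta-update
    theta = project_ball(theta)                  # keep the meta-initialization feasible
```

In this sketch, the per-task optimality gap would be measured against the best feasible task parameter in hindsight and the constraint violation is zero by construction of the projection; the paper instead bounds these quantities for its own framework, including the dynamic-regret and generalization terms described in the abstract.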

Original language: English (US)
Journal: Advances in Neural Information Processing Systems
Volume: 36
State: Published - 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States
Duration: Dec 10 2023 - Dec 16 2023

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
