Robust matrix estimations meet Frank–Wolfe algorithm

Naimin Jing, Ethan X. Fang, Cheng Yong Tang

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

We consider estimating matrix-valued model parameters with a dedicated focus on their robustness. Our setting concerns large-scale structured data, so a regularization on the matrix's rank becomes indispensable. Though robust loss functions are expected to be effective, their practical implementations are known to be difficult due to the non-smooth criterion functions encountered in the optimizations. To meet these challenges, we develop a highly efficient computing scheme taking advantage of projection-free Frank–Wolfe algorithms, which require only the first-order derivative of the criterion function. Our methodological framework is broad, extensively accommodating robust loss functions in conjunction with penalty functions in the context of matrix estimation problems. We establish non-asymptotic error bounds for the matrix estimations with the Huber loss and nuclear norm penalty in two concrete cases: matrix completion with partial and noisy observations, and reduced-rank regression. Our theory demonstrates the merits of using robust loss functions, so that matrix-valued estimators with good properties are achieved even when heavy-tailed distributions are involved. We illustrate the promising performance of our methods with extensive numerical examples and data analysis.
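To make the computational idea concrete, the following is a minimal sketch (not the paper's implementation) of a projection-free Frank–Wolfe iteration for one of the two cases the abstract names: matrix completion with the Huber loss under a nuclear-norm ball constraint. The function names, the step-size rule 2/(t+2), and the radius parameter `tau` are illustrative assumptions; the linear minimization oracle over the nuclear-norm ball only needs the top singular pair of the gradient, which is why no projection is required.

```python
import numpy as np

def huber_grad(r, delta):
    # Elementwise derivative of the Huber loss:
    # r where |r| <= delta, else delta * sign(r) (bounded for robustness)
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def frank_wolfe_completion(Y, mask, tau, delta=1.0, iters=300):
    """Sketch: minimize the Huber loss on observed entries (mask == True)
    subject to ||X||_* <= tau, via Frank-Wolfe. `tau` and `delta` are
    illustrative tuning parameters, not values from the paper."""
    X = np.zeros_like(Y)
    for t in range(iters):
        # First-order information only: gradient on the observed entries
        G = np.zeros_like(Y)
        G[mask] = huber_grad(X[mask] - Y[mask], delta)
        # Linear minimization oracle over the nuclear-norm ball:
        # a rank-one vertex -tau * u1 v1^T from the top singular pair of G
        U, s, Vt = np.linalg.svd(G, full_matrices=False)
        S = -tau * np.outer(U[:, 0], Vt[0, :])
        # Classic diminishing step size; the iterate stays feasible,
        # so no projection onto the nuclear-norm ball is ever needed
        gamma = 2.0 / (t + 2.0)
        X = (1.0 - gamma) * X + gamma * S
    return X
```

Because each iterate is a convex combination of rank-one vertices of the ball, the constraint is maintained by construction, which is the "projection-free" advantage the abstract refers to.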

Original language: English (US)
Pages (from-to): 2723-2760
Number of pages: 38
Journal: Machine Learning
Volume: 112
Issue number: 7
State: Published - Jul 2023

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
