Online sufficient dimension reduction through sliced inverse regression

Zhanrui Cai, Runze Li, Liping Zhu

Research output: Contribution to journal › Article › peer-review



Sliced inverse regression is an effective paradigm that achieves dimension reduction by replacing high-dimensional covariates with a small number of their linear combinations. It imposes no parametric assumptions on the dependence structure. More importantly, such a reduction of dimension is sufficient in that it causes no loss of information. In this paper, we adapt the stationary sliced inverse regression to cope with rapidly changing environments by implementing it in an online fashion. This online learner consists of two steps. In the first step we construct an online estimate of the kernel matrix; in the second step we propose two online algorithms, one motivated by the perturbation method and the other originating from gradient descent optimization, to perform online singular value decomposition. We establish the theoretical properties of this online learner and demonstrate its numerical performance through simulations and real-world applications. All numerical studies confirm that this online learner performs as well as the batch learner.
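The two-step online learner described above can be illustrated with a minimal sketch. The function below is hypothetical (it is not the authors' code): it keeps running per-slice means of the covariates to maintain an online estimate of the SIR kernel matrix, and updates the leading direction with a gradient-ascent (Oja-style) step, standing in for the paper's gradient-descent-based online singular value decomposition. Fixed slice cutpoints, standardized covariates, and a single direction are simplifying assumptions.

```python
import numpy as np

def online_sir(stream, cutpoints, p, eta=0.05, seed=0):
    """One-pass sketch of online sliced inverse regression (illustrative only).

    Step 1: maintain an online estimate of the kernel matrix
            M = sum_h p_h (m_h - mu)(m_h - mu)^T,
            where m_h is the running mean of x within slice h.
    Step 2: update the leading direction by a gradient-ascent step on
            beta^T M beta (an Oja-style stand-in for online SVD).
    Assumes standardized covariates (identity covariance).
    """
    H = len(cutpoints) + 1          # number of slices
    n = 0
    mu = np.zeros(p)                # global running mean of x
    counts = np.zeros(H)            # observations seen per slice
    slice_means = np.zeros((H, p))  # running mean of x within each slice
    beta = np.random.default_rng(seed).normal(size=p)
    beta /= np.linalg.norm(beta)

    for x, y in stream:
        n += 1
        mu += (x - mu) / n
        h = int(np.searchsorted(cutpoints, y))   # slice index of y
        counts[h] += 1
        slice_means[h] += (x - slice_means[h]) / counts[h]

        # Current kernel-matrix estimate; slices not yet seen get weight 0.
        # (A practical implementation would update M incrementally rather
        # than recomputing it, as in the paper's online algorithms.)
        centered = slice_means - mu
        M = (centered * (counts / n)[:, None]).T @ centered

        # Gradient step toward the top eigenvector of M, then renormalize.
        beta += eta * (M @ beta)
        beta /= np.linalg.norm(beta)
    return beta
```

As a quick check, streaming data from a single-index model `y = x'b + noise` recovers a direction closely aligned with `b` (up to sign), matching the batch SIR solution in this simple setting.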

Original language: English (US)
Journal: Journal of Machine Learning Research
State: Published - Jan 1 2020

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
  • Control and Systems Engineering
  • Statistics and Probability


