Abstract
We propose two new classes of estimators of the sufficient dimension reduction space based on invariant linear operators. Many second-order dimension reduction estimators, such as the Sliced Average Variance Estimate, Sliced Inverse Regression-II, Contour Regression, and Directional Regression, rely on the assumptions of a linear conditional mean and a constant conditional variance. In this paper we show that, under the linear conditional mean assumption alone, the candidate matrices of many second-order estimators are invariant for the dimension reduction subspace. As a result, these matrices provide useful information about the dimension reduction subspace; that is, a subset of their eigenvectors spans the dimension reduction subspace. Using this property, we develop two new methods for estimating the central subspace: the Iterative Invariant Transformation and the Nonparametrically Boosted Inverse Regression, the second of which is guaranteed to be.
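To make the notion of a second-order candidate matrix concrete, the sketch below computes a SAVE-type candidate matrix on simulated data and reads off a dimension reduction direction from its leading eigenvector. This is a minimal illustration of the general idea only, not the estimators proposed in the chapter; the slicing scheme, sample sizes, and the toy model are assumptions made for the example.

```python
import numpy as np

def save_candidate_matrix(X, y, n_slices=5):
    """SAVE-type candidate matrix: weighted average of (I - Cov(Z | slice))^2
    on standardized predictors Z.  Illustrative sketch only."""
    n, p = X.shape
    # Standardize predictors: Z = Sigma^{-1/2} (X - mean)
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_half = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_half

    # Slice the response into roughly equal-count bins and accumulate
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        D = np.eye(p) - np.cov(Z[idx], rowvar=False)
        M += (len(idx) / n) * (D @ D)
    return M, Sigma_inv_half

# Toy model: y depends on X only through the single direction beta,
# and only through the conditional variance (SIR would miss it).
rng = np.random.default_rng(0)
n, p = 2000, 6
X = rng.standard_normal((n, p))
beta = np.zeros(p); beta[0] = 1.0
y = (X @ beta) ** 2 + 0.1 * rng.standard_normal(n)

M, Sigma_inv_half = save_candidate_matrix(X, y)
evals, evecs = np.linalg.eigh(M)
# Map the leading eigenvector back to the original X scale
b_hat = Sigma_inv_half @ evecs[:, -1]
print(b_hat / np.linalg.norm(b_hat))  # close to +/- beta
```

In this sketch the eigenvector associated with the largest eigenvalue of the candidate matrix recovers the single direction through which the toy response depends on the predictors, which is the sense in which such matrices carry information about the dimension reduction subspace.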
| Original language | English (US) |
| --- | --- |
| Title of host publication | Festschrift in Honor of R. Dennis Cook |
| Subtitle of host publication | Fifty Years of Contribution to Statistical Science |
| Publisher | Springer International Publishing |
| Pages | 43-64 |
| Number of pages | 22 |
| ISBN (Electronic) | 9783030690090 |
| ISBN (Print) | 9783030690083 |
| State | Published - Apr 27 2021 |
All Science Journal Classification (ASJC) codes
- General Mathematics
- General Computer Science