Abstract
Many classical dimension reduction methods, especially those based on inverse conditional moments, require the predictors to have elliptical distributions, or at least to satisfy a linearity condition. Such conditions, however, are too strong for some applications. Li and Dong (2009) introduced the notion of the central solution space and used it to modify first-order methods, such as sliced inverse regression, so that they no longer rely on these conditions. In this paper we generalize this idea to second-order methods, such as sliced average variance estimation and directional regression. In doing so we demonstrate that the central solution space is a versatile framework: we can use it to modify essentially all inverse conditional moment-based methods to relax the distributional assumption on the predictors. Simulation studies and an application show a substantial improvement of the modified methods over their classical counterparts.
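For readers unfamiliar with the second-order methods named above, here is a minimal sketch of classical sliced average variance estimation (SAVE), one of the inverse-conditional-moment methods the paper builds on. The function name `save_directions`, the slicing scheme, and the toy data are illustrative assumptions, not the authors' implementation; the sketch relies on exactly the linearity and constant-variance conditions on the predictors that the modified methods are designed to relax.

```python
# Illustrative sketch of classical SAVE; not the authors' code.
import numpy as np

def save_directions(X, y, n_slices=5, n_dirs=2):
    """Estimate central-subspace directions via classical SAVE.

    Assumes the usual linearity/constant-variance conditions on X,
    which the modified methods in the paper do not require.
    """
    n, p = X.shape
    # Standardize predictors: Z = Sigma^{-1/2} (X - mean)
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt

    # Slice the response and average (I - Cov(Z | slice))^2 over slices
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        V_h = np.cov(Z[idx], rowvar=False)
        D = np.eye(p) - V_h
        M += (len(idx) / n) * (D @ D)

    # Leading eigenvectors of M span the estimated subspace in Z-scale;
    # map back to the original X-scale.
    _, vecs = np.linalg.eigh(M)
    beta = Sigma_inv_sqrt @ vecs[:, ::-1][:, :n_dirs]
    return beta

# Toy usage: y depends on X only through the single index X @ b.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
b = np.array([1.0, -1.0, 0.0, 0.0, 0.0, 0.0])
y = (X @ b) ** 2 + 0.1 * rng.normal(size=500)
print(save_directions(X, y))
```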
| Original language | English (US) |
|---|---|
| Pages (from-to) | 279-294 |
| Number of pages | 16 |
| Journal | Biometrika |
| Volume | 97 |
| Issue number | 2 |
| DOIs | |
| State | Published - Jun 2010 |
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- General Mathematics
- Agricultural and Biological Sciences (miscellaneous)
- General Agricultural and Biological Sciences
- Statistics, Probability and Uncertainty
- Applied Mathematics