TY - JOUR
T1 - A general theory for nonlinear sufficient dimension reduction
T2 - Formulation and estimation
AU - Lee, Kuang Yao
AU - Li, Bing
AU - Chiaromonte, Francesca
PY - 2013/2
Y1 - 2013/2
N2 - In this paper we introduce a general theory for nonlinear sufficient dimension reduction, and explore its ramifications and scope. This theory subsumes recent work employing reproducing kernel Hilbert spaces, and reveals many parallels between linear and nonlinear sufficient dimension reduction. Using these parallels we analyze the properties of existing methods and develop new ones. We begin by characterizing dimension reduction at the general level of σ-fields and proceed to that of classes of functions, leading to the notions of sufficient, complete and central dimension reduction classes. We show that, when it exists, the complete and sufficient class coincides with the central class, and can be unbiasedly and exhaustively estimated by a generalized sliced inverse regression estimator (GSIR). When completeness does not hold, this estimator captures only part of the central class. However, in these cases we show that a generalized sliced average variance estimator (GSAVE) can capture a larger portion of the class. Both estimators require no numerical optimization because they can be computed by spectral decomposition of linear operators. Finally, we compare our estimators with existing methods by simulation and on actual data sets.
UR - http://www.scopus.com/inward/record.url?scp=84878262605&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84878262605&partnerID=8YFLogxK
U2 - 10.1214/12-AOS1071
DO - 10.1214/12-AOS1071
M3 - Article
AN - SCOPUS:84878262605
SN - 0090-5364
VL - 41
SP - 221
EP - 249
JO - Annals of Statistics
JF - Annals of Statistics
IS - 1
ER -