TY - JOUR

T1 - Sufficient forecasting using factor models

AU - Fan, Jianqing

AU - Xue, Lingzhou

AU - Yao, Jiawei

N1 - Funding Information:
The authors are grateful to the editors and referees for their constructive comments. The authors thank Stephane Bonhomme, A. Ronald Gallant, Arthur Lewbel, Yuan Liao, Rosa Matzkin, Jose Luis Montiel Olea, and participants at Penn State University (Econometrics), New York University (Stern), Peking University (Statistics), University of South Carolina (Statistics), and The Institute for Fiscal Studies for their helpful suggestions. This paper is supported by National Institutes of Health grants R01-GM072611 and R01GM100474-04 and National Science Foundation grants DMS-1206464, DMS-1308566, and DMS-1505256.
Appendix. We first cite two lemmas from Fan et al. (2013), which are needed subsequently in the proofs.
Lemma A.1. Suppose $A$ and $B$ are two positive semi-definite matrices, and that $\lambda_{\min}(A) > c_{p,T}$ for some sequence $c_{p,T} > 0$. If $\|A - B\| = o_p(c_{p,T})$, then $\|A^{-1} - B^{-1}\| = O_p(c_{p,T}^{-2})\,\|A - B\|$.
Lemma A.2. Let $\{\lambda_i\}_{i=1}^{p}$ be the eigenvalues of $\Sigma$ in descending order and $\{\xi_i\}_{i=1}^{p}$ their associated eigenvectors. Correspondingly, let $\{\hat{\lambda}_i\}_{i=1}^{p}$ be the eigenvalues of $\hat{\Sigma}$ in descending order and $\{\hat{\xi}_i\}_{i=1}^{p}$ their associated eigenvectors. Then: (a) Weyl's theorem: $|\hat{\lambda}_i - \lambda_i| \le \|\hat{\Sigma} - \Sigma\|$. (b) The $\sin(\theta)$ theorem (Davis and Kahan, 1970): $\|\hat{\xi}_i - \xi_i\| \le \sqrt{2}\,\|\hat{\Sigma} - \Sigma\| \,/\, \min\bigl(|\hat{\lambda}_{i-1} - \lambda_i|,\ |\lambda_i - \hat{\lambda}_{i+1}|\bigr)$.
Publisher Copyright:
© 2017 Elsevier B.V.

PY - 2017/12

Y1 - 2017/12

N2 - We consider forecasting a single time series when there is a large number of predictors and a possibly nonlinear effect. The dimensionality is first reduced via a high-dimensional factor model implemented by principal component analysis. Using the extracted factors, we develop a novel forecasting method, called sufficient forecasting, which provides a set of sufficient predictive indices, inferred from high-dimensional predictors, to deliver additional predictive power. Projected principal component analysis is employed to enhance the accuracy of the inferred factors when a semi-parametric factor model is assumed. Our method is also applicable to cross-sectional sufficient regression using extracted factors. The connection between sufficient forecasting and the deep learning architecture is explicitly stated. Sufficient forecasting correctly estimates the projection indices of the underlying factors even in the presence of a nonparametric forecasting function. The proposed method extends sufficient dimension reduction to high-dimensional regimes by condensing the cross-sectional information through factor models. We derive asymptotic properties for the estimate of the central subspace spanned by these projection directions, as well as for the estimates of the sufficient predictive indices. We further show that the natural method of running a multiple regression of the target on the estimated factors yields a linear estimate that falls into this central subspace. Our method and theory allow the number of predictors to be larger than the number of observations. Finally, we demonstrate that sufficient forecasting improves upon linear forecasting in both simulation studies and an empirical study of forecasting macroeconomic variables.

AB - We consider forecasting a single time series when there is a large number of predictors and a possibly nonlinear effect. The dimensionality is first reduced via a high-dimensional factor model implemented by principal component analysis. Using the extracted factors, we develop a novel forecasting method, called sufficient forecasting, which provides a set of sufficient predictive indices, inferred from high-dimensional predictors, to deliver additional predictive power. Projected principal component analysis is employed to enhance the accuracy of the inferred factors when a semi-parametric factor model is assumed. Our method is also applicable to cross-sectional sufficient regression using extracted factors. The connection between sufficient forecasting and the deep learning architecture is explicitly stated. Sufficient forecasting correctly estimates the projection indices of the underlying factors even in the presence of a nonparametric forecasting function. The proposed method extends sufficient dimension reduction to high-dimensional regimes by condensing the cross-sectional information through factor models. We derive asymptotic properties for the estimate of the central subspace spanned by these projection directions, as well as for the estimates of the sufficient predictive indices. We further show that the natural method of running a multiple regression of the target on the estimated factors yields a linear estimate that falls into this central subspace. Our method and theory allow the number of predictors to be larger than the number of observations. Finally, we demonstrate that sufficient forecasting improves upon linear forecasting in both simulation studies and an empirical study of forecasting macroeconomic variables.

UR - http://www.scopus.com/inward/record.url?scp=85029445810&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85029445810&partnerID=8YFLogxK

U2 - 10.1016/j.jeconom.2017.08.009

DO - 10.1016/j.jeconom.2017.08.009

M3 - Article

C2 - 29731537

AN - SCOPUS:85029445810

SN - 0304-4076

VL - 201

SP - 292

EP - 306

JO - Journal of Econometrics

JF - Journal of Econometrics

IS - 2

ER -