TY - JOUR
T1 - A jackknife type approach to statistical model selection
AU - Lee, Hyunsook
AU - Jogesh Babu, G.
AU - Rao, C. R.
N1 - Funding Information:
We thank the referee for helpful comments. This work was supported in part by NSF Grant AST-0707833 (P.I.: G.J. Babu).
PY - 2012/1
Y1 - 2012/1
N2 - Procedures such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the minimum description length (MDL), and the bootstrap information criterion have been developed in the statistical literature for model selection. Most of these methods rely on bias estimation. This bias, which is inevitable in model selection problems, arises from estimating the distance between an unknown true model and an estimated model. Instead of estimating the bias, this paper develops a bias reduction based on a jackknife-type procedure. The jackknife method selects the model with minimum Kullback-Leibler divergence through bias reduction. It is shown that (a) the jackknife maximum likelihood estimator is consistent, (b) the jackknife estimate of the log likelihood is asymptotically unbiased, and (c) the stochastic order of the jackknife log likelihood estimate is O(log log n). Because of these properties, the jackknife information criterion is applicable to problems of choosing a model from separated families, especially when the true model is unknown. Compared with popular information criteria, which apply only to nested models such as those arising in regression and time series settings, the jackknife information criterion is more robust in filtering various types of candidate models when choosing the best approximating model.
AB - Procedures such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the minimum description length (MDL), and the bootstrap information criterion have been developed in the statistical literature for model selection. Most of these methods rely on bias estimation. This bias, which is inevitable in model selection problems, arises from estimating the distance between an unknown true model and an estimated model. Instead of estimating the bias, this paper develops a bias reduction based on a jackknife-type procedure. The jackknife method selects the model with minimum Kullback-Leibler divergence through bias reduction. It is shown that (a) the jackknife maximum likelihood estimator is consistent, (b) the jackknife estimate of the log likelihood is asymptotically unbiased, and (c) the stochastic order of the jackknife log likelihood estimate is O(log log n). Because of these properties, the jackknife information criterion is applicable to problems of choosing a model from separated families, especially when the true model is unknown. Compared with popular information criteria, which apply only to nested models such as those arising in regression and time series settings, the jackknife information criterion is more robust in filtering various types of candidate models when choosing the best approximating model.
UR - http://www.scopus.com/inward/record.url?scp=80052277048&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=80052277048&partnerID=8YFLogxK
U2 - 10.1016/j.jspi.2011.07.017
DO - 10.1016/j.jspi.2011.07.017
M3 - Article
AN - SCOPUS:80052277048
SN - 0378-3758
VL - 142
SP - 301
EP - 311
JO - Journal of Statistical Planning and Inference
JF - Journal of Statistical Planning and Inference
IS - 1
ER -