A jackknife type approach to statistical model selection

Hyunsook Lee, G. Jogesh Babu, C. R. Rao

Research output: Contribution to journal › Article › peer-review


Abstract

Procedures such as the Akaike information criterion (AIC), Bayesian information criterion (BIC), minimum description length (MDL), and bootstrap information criterion have been developed in the statistical literature for model selection. Most of these methods rely on estimating a bias. This bias, which is inevitable in model selection problems, arises from estimating the distance between an unknown true model and an estimated model. Instead of estimating the bias, this paper develops a jackknife-type bias-reduction procedure. The jackknife method selects the model of minimum Kullback-Leibler divergence through bias reduction. It is shown that (a) the jackknife maximum likelihood estimator is consistent, (b) the jackknife estimate of the log-likelihood is asymptotically unbiased, and (c) the stochastic order of the jackknife log-likelihood estimate is O(log log n). Because of these properties, the jackknife information criterion is applicable to problems of choosing a model from separated families, especially when the true model is unknown. Compared to popular information criteria, which are typically applied to nested models in settings such as regression and time series, the jackknife information criterion is more robust in filtering diverse types of candidate models when choosing the best approximating model.
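The abstract does not spell out the algorithmic details. The following is a minimal, hypothetical Python sketch of the general idea: apply the standard jackknife bias correction to the plug-in (maximized) average log-likelihood of each candidate family, refitting each family with one observation deleted at a time, and pick the family with the larger corrected value. The candidate families (gamma vs. lognormal), the scipy fitting calls, and the simulated data are illustrative assumptions, not taken from the paper.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=1.5, size=100)  # hypothetical sample

def avg_loglik_gamma(sample):
    # Maximum likelihood fit, then per-observation average log-likelihood.
    a, loc, scale = stats.gamma.fit(sample, floc=0)
    return stats.gamma.logpdf(sample, a, loc=loc, scale=scale).mean()

def avg_loglik_lognorm(sample):
    s, loc, scale = stats.lognorm.fit(sample, floc=0)
    return stats.lognorm.logpdf(sample, s, loc=loc, scale=scale).mean()

def jackknife_criterion(sample, avg_loglik):
    # Standard jackknife bias reduction applied to the plug-in average
    # log-likelihood: n*T_n - (n-1)*mean(T_{n-1,(i)}), where T_{n-1,(i)}
    # refits the model with observation i deleted. This is a generic
    # jackknife correction, not necessarily the paper's exact criterion.
    n = len(sample)
    t_full = avg_loglik(sample)
    t_loo = np.array([avg_loglik(np.delete(sample, i)) for i in range(n)])
    return n * t_full - (n - 1) * t_loo.mean()

# Larger value = smaller estimated Kullback-Leibler divergence to the truth.
for name, fn in [("gamma", avg_loglik_gamma), ("lognormal", avg_loglik_lognorm)]:
    print(name, jackknife_criterion(x, fn))

Note that the two candidate families here are separated (non-nested), which is exactly the setting the abstract highlights: the comparison needs only a refit of each family on each leave-one-out sample, not any nesting structure between the models.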

Original language: English (US)
Pages (from-to): 301-311
Number of pages: 11
Journal: Journal of Statistical Planning and Inference
Volume: 142
Issue number: 1
DOIs
State: Published - Jan 2012

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
  • Applied Mathematics
