Kernel density-based linear regression estimate

Weixin Yao, Zhibiao Zhao

Research output: Contribution to journal › Article › peer-review


Abstract

For linear regression models with non-normally distributed errors, the least squares estimate (LSE) loses some efficiency relative to the maximum likelihood estimate (MLE). In this article, we propose a kernel density-based regression estimate (KDRE) that is adaptive to the unknown error distribution. The key idea is to approximate the likelihood function using a nonparametric kernel density estimate of the error density, based on some initial parameter estimate. The proposed estimate is shown to be asymptotically as efficient as the oracle MLE, which assumes the error density is known. In addition, we propose an EM-type algorithm to maximize the estimated likelihood function and show that the KDRE can be viewed as an iterated weighted least squares estimate, which provides insight into the adaptiveness of the KDRE to the unknown error distribution. Our Monte Carlo simulation studies show that, while comparable to the traditional LSE for normal errors, the proposed estimation procedure can achieve substantial efficiency gains for non-normal errors. Moreover, the efficiency gain can be achieved even for small sample sizes.
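The procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a Gaussian kernel, a rule-of-thumb bandwidth, and an ordinary least squares initial fit, all of which are hypothetical choices. The EM-type update treats the kernel density estimate of the residuals as a mixture of components centered at the initial residuals, and each M-step reduces to a least squares fit on an adjusted response.

```python
import numpy as np

def kdre(X, y, bandwidth=None, n_iter=50, tol=1e-8):
    """Sketch of a kernel density-based regression estimate (KDRE).

    Assumptions (not from the paper): Gaussian kernel, Silverman-style
    rule-of-thumb bandwidth, OLS as the initial parameter estimate.
    """
    n, p = X.shape
    # Step 1: initial parameter estimate via ordinary least squares.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    # Residuals from the initial fit anchor the kernel density estimate.
    eps0 = y - X @ beta
    if bandwidth is None:
        # Rule-of-thumb bandwidth (an assumption; the paper may use another).
        bandwidth = 1.06 * eps0.std() * n ** (-1 / 5)
    h = bandwidth
    for _ in range(n_iter):
        r = y - X @ beta                       # current residuals
        # E-step: posterior weights over kernel "components" at eps0_j.
        d = r[:, None] - eps0[None, :]         # n x n residual differences
        logk = -0.5 * (d / h) ** 2             # Gaussian kernel log-weights
        logk -= logk.max(axis=1, keepdims=True)
        w = np.exp(logk)
        w /= w.sum(axis=1, keepdims=True)      # rows sum to one
        # M-step: since each row of w sums to one, minimizing the weighted
        # sum of squares is OLS on the adjusted response y_i - sum_j w_ij eps0_j.
        y_adj = y - w @ eps0
        beta_new, *_ = np.linalg.lstsq(X, y_adj, rcond=None)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta
```

A quick use: fit a line to data with Laplace (double-exponential) errors, where the abstract suggests KDRE can outperform plain least squares.

```python
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.laplace(scale=0.5, size=n)
beta_hat = kdre(X, y)   # estimate should be close to (1.0, 2.0)
```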

Original language: English (US)
Pages (from-to): 4499-4512
Number of pages: 14
Journal: Communications in Statistics - Theory and Methods
Volume: 42
Issue number: 24
DOIs
State: Published - Dec 17 2013

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
