Explicit estimators of parametric functions in nonlinear regression

Research output: Contribution to journal › Article › peer-review

When repetitive estimations are to be made under field conditions using data that follow a nonlinear regression law, a simple polynomial function of the observations has considerable appeal as an estimator. The polynomial estimator of finite degree with smallest average mean squared error is found. Conditions are given such that as degree increases it converges in probability to the Bayes estimator and its average mean squared error converges to the lower bound of all square integrable estimators. In an example, a linear estimator performs better than the maximum likelihood estimator and nearly as well as the Bayes estimator.
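The idea in the abstract can be sketched numerically: the polynomial estimator of a given degree with smallest average (prior-weighted) mean squared error is the L2 projection of the parameter onto polynomials of the observations, which can be approximated by simulating parameter-data pairs from the prior and the model and regressing the parameter on polynomial features of the data. The model, prior, design points, and degree below are illustrative assumptions, not taken from the paper:

```python
# Illustrative sketch (not the paper's code). Assumed setup:
# nonlinear model y_j = exp(-theta * x_j) + noise, prior theta ~ Uniform(0, 2).
# The least-squares fit of theta on polynomial features of y approximates
# the finite-degree polynomial estimator with smallest average MSE.
import numpy as np

rng = np.random.default_rng(0)
x = np.array([0.25, 0.5, 1.0])        # fixed design points (assumed)
n_sim, degree, sigma = 20000, 2, 0.1  # Monte Carlo size, poly degree, noise sd

theta = rng.uniform(0.0, 2.0, n_sim)  # draws from the assumed prior
y = np.exp(-np.outer(theta, x)) + sigma * rng.standard_normal((n_sim, len(x)))

# Polynomial features of the observations: intercept, y_j, y_j**2, ...
feats = [np.ones(n_sim)]
for d in range(1, degree + 1):
    feats.extend(y[:, j] ** d for j in range(y.shape[1]))
Z = np.column_stack(feats)

# Least squares of theta on Z gives the coefficients of the explicit
# polynomial estimator under the simulated prior-predictive distribution.
coef, *_ = np.linalg.lstsq(Z, theta, rcond=None)

def poly_estimate(y_new):
    """Evaluate the fitted polynomial estimator at a new observation vector."""
    z = np.concatenate([[1.0]] + [y_new ** d for d in range(1, degree + 1)])
    return z @ coef
```

Once fitted, `poly_estimate` is an explicit closed-form function of the data, so repeated estimation under field conditions needs no iterative optimization, which is the practical appeal the abstract describes.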

Original language: English (US)
Pages (from-to): 182-193
Number of pages: 12
Journal: Journal of the American Statistical Association
Issue number: 369
State: Published - Mar 1980

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

