Parameterized cross-validation for nonlinear regression models

Imhoi Koo, Namgil Lee, Rhee Man Kil

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

This paper presents a new method of cross-validation (CV) for nonlinear regression problems. In conventional CV methods, a validation set, that is, a part of the training data, is used to check the performance of learning. As a result, the trained regression models cannot utilize the whole training data and achieve less performance than expected for the given training data. In this context, we consider constructing a performance prediction model from the validation set in order to determine the optimal structure for the whole training data. We analyze risk bounds using VC dimension theory and suggest a parameterized form of risk estimates for the performance prediction model. As a result, we can estimate the optimal structure for the whole training data using the suggested CV method, referred to as the parameterized CV (p-CV) method. Through simulations of function approximation, we show the effectiveness of our approach.
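The abstract does not give the authors' exact parameterized risk form, but the general idea can be illustrated with a minimal sketch: assume a VC-style penalized risk estimate of the form R(h) ≈ R_emp(h) · (1 + a · (h/n)^b), fit the parameters (a, b) from a held-out validation split, and then reuse the fitted estimate to choose the model structure for a model retrained on the whole training set. The penalty form, the polynomial-degree notion of "structure", and all names below are illustrative assumptions, not the p-CV formulation from the paper.

```python
# Hypothetical sketch of parameterized-CV-style model selection (assumed
# penalty form, not the authors' exact method).
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    # Toy function-approximation problem: noisy sine.
    x = rng.uniform(-1.0, 1.0, size=n)
    y = np.sin(2.0 * np.pi * x) + 0.2 * rng.standard_normal(n)
    return x, y

def fit_poly(x, y, degree):
    return np.polyfit(x, y, degree)

def mse(coeffs, x, y):
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

x, y = make_data(120)
x_tr, y_tr = x[:90], y[:90]   # partial training set
x_va, y_va = x[90:], y[90:]   # validation set
degrees = np.arange(1, 13)    # candidate model structures

# Empirical (training) and validation risks on the partial split.
emp, val = [], []
for d in degrees:
    c = fit_poly(x_tr, y_tr, d)
    emp.append(mse(c, x_tr, y_tr))
    val.append(mse(c, x_va, y_va))
emp, val = np.array(emp), np.array(val)

# Assumed parameterized risk estimate: R(h) ~ R_emp(h) * (1 + a * (h/n)**b),
# with h taken as the number of polynomial coefficients. Fit (a, b) by a
# log-linear least-squares fit of (val/emp - 1) against h/n.
p = (degrees + 1) / len(x_tr)
ratio = np.maximum(val / emp - 1.0, 1e-6)
b, log_a = np.polyfit(np.log(p), np.log(ratio), 1)
a = np.exp(log_a)

# Reuse the fitted estimate to rank structures retrained on the WHOLE data
# set, and pick the structure minimizing the predicted risk.
p_full = (degrees + 1) / len(x)
emp_full = np.array([mse(fit_poly(x, y, d), x, y) for d in degrees])
risk_est = emp_full * (1.0 + a * p_full ** b)
best = degrees[int(np.argmin(risk_est))]
print("selected degree:", best)
```

In contrast to plain k-fold CV, the final model here is trained on all available data; the validation split is used only to calibrate the parameters of the risk estimate, which is the spirit of the p-CV idea described in the abstract.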

Original language: English (US)
Pages (from-to): 3089-3095
Number of pages: 7
Journal: Neurocomputing
Volume: 71
Issue number: 16-18
DOIs
State: Published - Oct 2008

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence

