True Risk Bounds for the Regression of Real-Valued Functions

Rhee Man Kil, Imhoi Koo

Research output: Contribution to conference › Paper › peer-review

1 Scopus citation

Abstract

This paper presents a new form of true risk bounds for the regression of real-valued functions. The goal of machine learning is to minimize the true risk (or generalization error) over the whole distribution of the sample space, not just over a set of training samples. However, the true risk cannot be estimated accurately from a finite number of samples. In this sense, we derive a form of true risk bounds that may provide a useful guideline for optimizing learning models. Through simulations of function approximation, we show that the true risk bounds predicted from the suggested functional form fit the empirical data well.
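The gap the abstract describes — between the empirical risk measured on training samples and the true risk over the whole input distribution — can be illustrated with a minimal sketch. This is not the paper's bound; it is a hypothetical example that approximates the true risk by Monte Carlo on a large independent sample, using a polynomial regressor on a known target function:

```python
import numpy as np

# Illustrative sketch (not the paper's method): compare the empirical risk
# (training MSE) of a polynomial regressor with an estimate of its true
# risk, obtained by Monte Carlo over a large fresh sample.
rng = np.random.default_rng(0)

def target(x):
    # Ground-truth function to be approximated (an assumption for this demo).
    return np.sin(2 * np.pi * x)

n_train, noise = 20, 0.1
x_tr = rng.uniform(0.0, 1.0, n_train)
y_tr = target(x_tr) + noise * rng.standard_normal(n_train)

# Fit a high-degree polynomial by least squares (prone to overfitting).
coef = np.polyfit(x_tr, y_tr, deg=9)
emp_risk = np.mean((np.polyval(coef, x_tr) - y_tr) ** 2)

# "True" risk approximated on a large independent sample from the
# same distribution; with finite training data it typically exceeds
# the empirical risk, which is why bounds on the gap are useful.
n_test = 100_000
x_te = rng.uniform(0.0, 1.0, n_test)
y_te = target(x_te) + noise * rng.standard_normal(n_test)
true_risk = np.mean((np.polyval(coef, x_te) - y_te) ** 2)

print(f"empirical risk:      {emp_risk:.4f}")
print(f"estimated true risk: {true_risk:.4f}")
```

On a run like this the estimated true risk comes out larger than the empirical risk, which is the phenomenon a true risk bound is meant to control.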

Original language: English (US)
Pages: 507-512
Number of pages: 6
State: Published - 2003
Event: International Joint Conference on Neural Networks 2003 - Portland, OR, United States
Duration: Jul 20, 2003 - Jul 24, 2003

Conference

Conference: International Joint Conference on Neural Networks 2003
Country/Territory: United States
City: Portland, OR
Period: 7/20/03 - 7/24/03

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence

