Abstract
This paper presents a new regression method based on a network of Gaussian kernel functions (GKFs) trained on a set of samples for function approximation. For the regression of a network with GKFs, the error confidence interval, defined as the absolute difference between the general and empirical risks, is derived in the sense of probably approximately correct (PAC) learning, instead of relying on a validation set. However, the coefficients in this theoretical bound are greatly overestimated and depend on the given samples and network models. Accordingly, an estimation model of the error confidence interval is suggested, and the coefficients of the suggested model are estimated for the given samples and network models. The estimated error confidence intervals are then used to guide the training of the network model toward minimizing the general error. To show the effectiveness of our approach, the error confidence intervals for the prediction of the Mackey-Glass time series are estimated and compared with experimental results.
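The approach described above can be sketched in code: generate a Mackey-Glass series, fit GKF networks of varying size by least squares, and select the model minimizing the empirical risk plus a PAC-style confidence term. This is a minimal illustrative sketch, not the paper's method: the bound form `pac_style_interval`, the kernel widths, the embedding dimension, and the candidate model sizes are all assumptions, and the paper's actual contribution (estimating the bound's coefficients from data) is replaced here by a generic textbook-style PAC term.

```python
import numpy as np

def mackey_glass(n_samples, tau=17, beta=0.2, gamma=0.1, p=10, dt=1.0, seed=0):
    """Generate a Mackey-Glass time series by Euler integration
    (standard chaotic benchmark; these parameter values are the usual defaults)."""
    rng = np.random.default_rng(seed)
    x = list(1.2 + 0.1 * rng.standard_normal(tau))  # random initial history
    for _ in range(n_samples):
        x_tau = x[-tau]
        x.append(x[-1] + dt * (beta * x_tau / (1 + x_tau**p) - gamma * x[-1]))
    return np.array(x[tau:])

def empirical_risk(y_true, y_pred):
    """Mean squared error on the training sample."""
    return np.mean((y_true - y_pred) ** 2)

def pac_style_interval(n, h, delta=0.05):
    """Generic PAC-style confidence term ~ sqrt((h*log(2n/h) + log(4/delta))/n),
    with h acting as a capacity measure. Illustrative placeholder only:
    the paper instead estimates the coefficients of such a model from data."""
    return np.sqrt((h * np.log(2 * n / h) + np.log(4 / delta)) / n)

# One-step-ahead prediction task from the series (embedding dimension 4 is
# a common, but here assumed, choice for this benchmark).
series = mackey_glass(600)
d = 4
X = np.column_stack([series[i:len(series) - d + i] for i in range(d)])
y = series[d:]
n = len(y)

best = None
for h in (5, 10, 20, 40):  # candidate numbers of Gaussian kernels
    idx = np.linspace(0, n - 1, h).astype(int)
    centers, widths = X[idx], np.full(h, 0.5)  # widths fixed for simplicity
    # Kernel design matrix: Phi[i, k] = exp(-||x_i - c_k||^2 / (2 s_k^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2 * widths**2))
    # Output weights by linear least squares.
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    # "Guaranteed" risk = empirical risk + confidence interval on the gap
    # between general and empirical risk.
    guaranteed = empirical_risk(y, Phi @ w) + pac_style_interval(n, h)
    if best is None or guaranteed < best[0]:
        best = (guaranteed, h)

print("selected number of kernels:", best[1])
```

Selecting the model size by this penalized criterion, rather than by validation-set error, is the role the estimated error confidence intervals play in the paper.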
| Original language | English (US) |
|---|---|
| Pages | 1762-1766 |
| Number of pages | 5 |
| State | Published - 2001 |
| Event | International Joint Conference on Neural Networks (IJCNN'01), Washington, DC, United States. Duration: Jul 15, 2001 → Jul 19, 2001 |
Other
| Other | International Joint Conference on Neural Networks (IJCNN'01) |
|---|---|
| Country/Territory | United States |
| City | Washington, DC |
| Period | 7/15/01 → 7/19/01 |
All Science Journal Classification (ASJC) codes
- Software
- Artificial Intelligence