Abstract
Recurrent networks have become popular models for system identification and time series prediction. NARX (Nonlinear AutoRegressive models with eXogenous inputs) network models are a popular subclass of recurrent networks and have been used in many applications. Though embedded memory can be found in all recurrent network models, it is particularly prominent in NARX models. We show that using intelligent memory order selection through pruning and good initial heuristics significantly improves the generalization and predictive performance of these nonlinear systems on problems as diverse as grammatical inference and time series prediction.
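The NARX architecture makes its embedded memory explicit as tapped delay lines on the exogenous input and the fed-back output, so the memory orders n_u and n_y are exactly the quantities that memory order selection (e.g., by pruning delay taps) operates on. The sketch below only illustrates the NARX prediction equation y(t) = f(u(t-1), ..., u(t-n_u), y(t-1), ..., y(t-n_y)); the function and variable names and the toy tanh map are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# Hypothetical illustration of NARX one-step-ahead prediction; names
# (narx_predict, f, input_order, output_order) are not from the paper.
def narx_predict(f, u, y_init, input_order, output_order):
    """Iterate y(t) = f(u(t-1),...,u(t-n_u), y(t-1),...,y(t-n_y)),
    feeding predictions back through the output delay line.
    y_init must supply at least max(input_order, output_order) seed
    values so every delay tap is filled at the first step."""
    y = list(y_init)                                    # seed the output taps
    for t in range(len(y_init), len(u)):
        u_taps = np.asarray(u[t - input_order:t])[::-1]   # delayed exogenous inputs
        y_taps = np.asarray(y[t - output_order:t])[::-1]  # delayed, fed-back outputs
        y.append(float(f(np.concatenate([u_taps, y_taps]))))
    return np.array(y)

# Toy usage: a tanh unit standing in for a trained nonlinear network f.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    u = rng.standard_normal(50)
    w = 0.1 * rng.standard_normal(4)          # 2 input taps + 2 output taps
    y_hat = narx_predict(lambda x: np.tanh(w @ x), u,
                         y_init=[0.0, 0.0], input_order=2, output_order=2)
```

Pruning the weights attached to the longest delay taps (and keeping only those that matter) is one way such a model's memory order could be selected in practice.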
Original language | English (US) |
---|---|
Title of host publication | IEEE World Congress on Computational Intelligence |
Editors | Anon |
Publisher | IEEE |
Pages | 1834-1839 |
Number of pages | 6 |
Volume | 3 |
State | Published - 1998 |
Event | Proceedings of the 1998 IEEE International Joint Conference on Neural Networks. Part 1 (of 3) - Anchorage, AK, USA; Duration: May 4 1998 → May 9 1998 |
Other
Other | Proceedings of the 1998 IEEE International Joint Conference on Neural Networks. Part 1 (of 3) |
---|---|
City | Anchorage, AK, USA |
Period | 5/4/98 → 5/9/98 |
All Science Journal Classification (ASJC) codes
- Software