Past is important: A method for determining memory structure in NARX neural networks

C. Lee Giles, Tsungnan Lin, Bill G. Horne, S. Y. Kung

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Recurrent networks have become popular models for system identification and time series prediction. NARX (Nonlinear AutoRegressive models with eXogenous inputs) network models are a popular subclass of recurrent networks and have been used in many applications. Though embedded memory can be found in all recurrent network models, it is particularly prominent in NARX models. We show that using intelligent memory order selection through pruning and good initial heuristics significantly improves the generalization and predictive performance of these nonlinear systems on problems as diverse as grammatical inference and time series prediction.
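
For context, a NARX network with input memory order $n_u$ and output memory order $n_y$ computes its output from tapped delay lines of past inputs and past outputs; a standard formulation (our notation, not quoted from the paper) is

$$ y(t) = \Psi\bigl(u(t), u(t-1), \ldots, u(t-n_u),\; y(t-1), \ldots, y(t-n_y)\bigr) $$

where $\Psi$ is the mapping realized by a feedforward network and $u(t)$, $y(t)$ are the input and output at time $t$. The memory orders $n_u$ and $n_y$ constitute the embedded memory the abstract refers to, and they are the structural parameters that memory order selection via pruning and initialization heuristics would target.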

Original language: English (US)
Title of host publication: IEEE World Congress on Computational Intelligence
Editors: Anon
Publisher: IEEE
Pages: 1834-1839
Number of pages: 6
Volume: 3
State: Published - 1998
Event: Proceedings of the 1998 IEEE International Joint Conference on Neural Networks, Part 1 (of 3) - Anchorage, AK, USA
Duration: May 4, 1998 - May 9, 1998


All Science Journal Classification (ASJC) codes

  • Software
