Effects of Noise on Convergence and Generalization in Recurrent Networks

Kam Jim, Bill G. Horne, C. Lee Giles

Research output: Contribution to conference › Paper › peer-review


Abstract

We introduce and study methods of inserting synaptic noise into dynamically-driven recurrent neural networks and show that applying a controlled amount of noise during training may improve convergence and generalization. In addition, we analyze the effects of each noise parameter (additive vs. multiplicative, cumulative vs. non-cumulative, per time step vs. per string) and predict that the best overall performance is achieved by injecting additive noise at each time step. Extensive simulations on learning the dual parity grammar from temporal strings substantiate these predictions.
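To make the favored variant concrete, the sketch below shows per-time-step, additive, non-cumulative synaptic noise: fresh zero-mean Gaussian perturbations are added to the weights at every step of the forward pass, while the clean weights are left untouched so the perturbations do not accumulate. This is a minimal illustration only, assuming a simple Elman-style recurrent cell; the function and parameter names (`rnn_step`, `noise_std`) and the values used are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_step(x_t, h_prev, W_in, W_rec, b, noise_std=0.05, training=True):
    """One recurrent step with additive, non-cumulative synaptic noise.

    During training, zero-mean Gaussian noise is drawn anew at every time
    step and added to copies of the weight matrices; the clean weights are
    never modified, so perturbations do not accumulate across steps.
    """
    if training and noise_std > 0.0:
        W_in_t = W_in + rng.normal(0.0, noise_std, W_in.shape)
        W_rec_t = W_rec + rng.normal(0.0, noise_std, W_rec.shape)
    else:
        W_in_t, W_rec_t = W_in, W_rec
    return np.tanh(W_in_t @ x_t + W_rec_t @ h_prev + b)

# Example: a noisy forward pass over one (dummy) binary input string.
n_in, n_hid, T = 2, 8, 10
W_in = rng.normal(0.0, 0.5, (n_hid, n_in))
W_rec = rng.normal(0.0, 0.5, (n_hid, n_hid))
b = np.zeros(n_hid)

h = np.zeros(n_hid)
for t in range(T):
    x_t = rng.integers(0, 2, n_in).astype(float)
    h = rnn_step(x_t, h, W_in, W_rec, b)
```

A multiplicative variant would instead scale each weight by (1 + noise), a cumulative variant would add the perturbation to the stored weights themselves, and a per-string variant would draw the noise once per input sequence rather than once per step.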

Original language: English (US)
Pages: 649-656
Number of pages: 8
State: Published - 1994
Event: 7th International Conference on Neural Information Processing Systems, NIPS 1994 - Denver, United States

Conference

Conference: 7th International Conference on Neural Information Processing Systems, NIPS 1994
Country/Territory: United States
City: Denver

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Signal Processing
  • Computer Networks and Communications

