Training Second-Order Recurrent Neural Networks using Hints

Christian W. Omlin, C. Lee Giles

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

31 Scopus citations

Abstract

We investigate a method for inserting rules into discrete-time second-order recurrent neural networks which are trained to recognize regular languages. The rules defining regular languages can be expressed in the form of transitions in the corresponding deterministic finite-state automaton. Inserting these rules as hints into networks with second-order connections is straightforward. Our simulation results show that even weak hints seem to improve the convergence time by an order of magnitude.
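The rule-insertion idea described in the abstract can be illustrated with a small sketch. The following is a hypothetical, simplified encoding (not the authors' exact procedure): assuming the usual second-order update S_i(t+1) = g(Σ_jk W_ijk · S_j(t) · I_k(t)) with one state neuron per DFA state and a one-hot input per symbol, a DFA transition δ(state j, symbol k) = state i is inserted by programming W[i, j, k] to a large positive value H and the competing weights W[l, j, k] (l ≠ i) to −H; the hint strength H is an assumed parameter controlling how "weak" the hint is.

```python
import numpy as np

def insert_hints(n_states, n_symbols, transitions, H=2.0, seed=0):
    """Build a second-order weight tensor W[i, j, k] encoding DFA
    transitions as hints (illustrative sketch, not the paper's exact scheme).

    transitions: dict mapping (state_j, symbol_k) -> state_i
    H: assumed hint strength; a smaller H gives a weaker hint.
    """
    rng = np.random.default_rng(seed)
    # Start from small random weights, as an untrained network would.
    W = rng.uniform(-0.1, 0.1, size=(n_states, n_states, n_symbols))
    for (j, k), i in transitions.items():
        W[:, j, k] = -H   # push all state neurons low for this transition...
        W[i, j, k] = H    # ...except the target state, which is driven high
    return W

def step(W, S, k):
    """One second-order update for a one-hot input symbol k (sigmoid g)."""
    net = np.einsum('ij,j->i', W[:, :, k], S)
    return 1.0 / (1.0 + np.exp(-net))

# Example: a 2-state DFA over {0, 1} that toggles state on symbol 1
# and stays put on symbol 0.
trans = {(0, 1): 1, (1, 1): 0, (0, 0): 0, (1, 0): 1}
W = insert_hints(n_states=2, n_symbols=2, transitions=trans)
S = np.array([1.0, 0.0])  # start in DFA state 0
S = step(W, S, 1)         # read symbol 1: activation moves toward state 1
```

After insertion the programmed weights already bias the network toward the automaton's behavior, so gradient training only has to refine them rather than discover the transitions from scratch.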

Original language: English (US)
Title of host publication: Proceedings of the 9th International Workshop on Machine Learning, ICML 1992
Editors: Derek H. Sleeman, Peter Edwards
Publisher: Morgan Kaufmann Publishers, Inc.
Pages: 361-366
Number of pages: 6
ISBN (Electronic): 155860247X, 9781558602472
DOIs
State: Published - 1992
Event: 9th International Conference on Machine Learning, ICML 1992 - Aberdeen, United Kingdom
Duration: Jul 1 1992 - Jul 3 1992

Publication series

Name: Proceedings of the 9th International Workshop on Machine Learning, ICML 1992

Conference

Conference: 9th International Conference on Machine Learning, ICML 1992
Country/Territory: United Kingdom
City: Aberdeen
Period: 7/1/92 - 7/3/92

All Science Journal Classification (ASJC) codes

  • Human-Computer Interaction
  • Software
  • Theoretical Computer Science
  • Artificial Intelligence
