Second-order recurrent neural networks for grammatical inference

C. L. Giles, D. Chen, C. B. Miller, H. H. Chen, G. Z. Sun, Y. C. Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

41 Scopus citations

Abstract

It is shown that a recurrent, second-order neural network using a real-time, feed-forward training algorithm readily learns to infer regular grammars from positive and negative string training samples. Numerous simulations that show the effects of initial conditions, training set size and order, and neuron architecture are presented. All simulations were performed with random initial weight strengths and usually converged after approximately one hundred epochs of training. The authors discuss a quantization algorithm for dynamically extracting finite-state automata during and after training. For a well-trained neural net, the extracted automata constitute an equivalence class of state machines that are reducible to the minimal machine of the inferred grammar. Simulations then show that many of the neural net state machines are dynamically stable and correctly classify long unseen strings.
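
The second-order state update and the quantization idea described in the abstract can be summarized in a short sketch. The update below uses the standard second-order formulation S_j(t+1) = g(Σ_{i,k} W_{jik} S_i(t) I_k(t)); the class name, dimensions, and the extract_automaton helper are illustrative assumptions, not the authors' implementation or training algorithm.

```python
import numpy as np

# Sketch of a second-order recurrent network state update, assuming the
# usual tensor-weight formulation: W[j, i, k] couples the current state
# S[i] with the current one-hot input symbol I[k] to give the next state S[j].
# All names and sizes here are illustrative assumptions.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SecondOrderRNN:
    def __init__(self, n_states=5, n_symbols=2, rng=None):
        rng = rng or np.random.default_rng(0)
        # Random initial weight strengths, as in the simulations.
        self.W = rng.uniform(-1.0, 1.0, size=(n_states, n_states, n_symbols))
        self.n_states = n_states
        self.n_symbols = n_symbols

    def run(self, string):
        """Process a string of symbol indices; return the final state vector."""
        S = np.zeros(self.n_states)
        S[0] = 1.0                        # designated start state
        for sym in string:
            I = np.zeros(self.n_symbols)
            I[sym] = 1.0                  # one-hot encoding of the input symbol
            # Second-order (multiplicative) update: state x input -> next state
            S = sigmoid(np.einsum('jik,i,k->j', self.W, S, I))
        return S

    def extract_automaton(self, strings, bins=2):
        """Hypothetical quantization step: partition the visited analog state
        vectors into hypercubes so each cell acts as one discrete DFA state
        (a sketch of the extraction idea, not the paper's exact algorithm)."""
        seen = {}
        for s in strings:
            S = np.zeros(self.n_states); S[0] = 1.0
            for sym in s:
                I = np.zeros(self.n_symbols); I[sym] = 1.0
                S = sigmoid(np.einsum('jik,i,k->j', self.W, S, I))
                q = tuple((S * bins).astype(int))   # quantized state label
                seen.setdefault(q, len(seen))
        return seen  # mapping from quantized cell to discrete state index

# Example: run an untrained net on the string "0110" over the alphabet {0, 1}.
net = SecondOrderRNN()
print(net.run([0, 1, 1, 0]))
```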

Original language: English (US)
Title of host publication: Proceedings. IJCNN - International Joint Conference on Neural Networks
Editors: Anon
Publisher: IEEE
Pages: 273-281
Number of pages: 9
ISBN (Print): 0780301641
State: Published - 1992
Event: International Joint Conference on Neural Networks - IJCNN-91-Seattle - Seattle, WA, USA
Duration: Jul 8, 1991 - Jul 12, 1991

Publication series

Name: Proceedings. IJCNN - International Joint Conference on Neural Networks

Other

Other: International Joint Conference on Neural Networks - IJCNN-91-Seattle
City: Seattle, WA, USA
Period: 7/8/91 - 7/12/91

All Science Journal Classification (ASJC) codes

  • General Engineering
