Abstract
The experimental results in this paper demonstrate that a simple pruning/retraining method effectively improves the generalization performance of recurrent neural networks trained to recognize regular languages. The technique also permits the extraction of symbolic knowledge in the form of deterministic finite-state automata (DFAs) that are more consistent with the rules to be learned. Weight decay has also been shown to improve a network's generalization performance. Simulations with two small DFAs (≤10 states) and a large finite-memory machine (64 states) demonstrate that the performance improvement due to pruning/retraining is generally superior to that due to training with weight decay. In addition, pruning/retraining removes the need to guess a 'good' decay rate.
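The paper itself details the pruning/retraining procedure; as a rough illustration of the two regularization strategies the abstract compares, the sketch below trains a small recurrent network on a toy regular language (parity), prunes the smallest-magnitude recurrent weights, retrains the survivors, and contrasts this with a weight-decay baseline. The magnitude-based pruning criterion, the parity task, and all hyperparameters are illustrative assumptions, not the paper's exact setup.

```python
# Minimal sketch: pruning/retraining vs. weight decay for an RNN
# learning a regular language. All choices here (task, pruning
# fraction, optimizer settings) are assumptions for illustration.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_batch(n=64, length=8):
    # Toy regular language: classify binary strings by parity
    # (recognizable by a 2-state DFA).
    x = torch.randint(0, 2, (n, length, 1)).float()
    y = (x.sum(dim=(1, 2)) % 2).long()
    return x, y

class TinyRNN(nn.Module):
    def __init__(self, hidden=8):
        super().__init__()
        self.rnn = nn.RNN(1, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 2)

    def forward(self, x):
        _, h = self.rnn(x)          # final hidden state
        return self.out(h[-1])

def train(model, steps=500, weight_decay=0.0, masks=None):
    # weight_decay > 0 gives the L2-penalty variant; masks (if given)
    # keep pruned weights clamped to zero during retraining.
    opt = torch.optim.Adam(model.parameters(), lr=1e-2,
                           weight_decay=weight_decay)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        x, y = make_batch()
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
        if masks is not None:
            with torch.no_grad():
                for p, m in masks.items():
                    p.mul_(m)

def prune_smallest(model, frac=0.3):
    # Zero out the fraction `frac` of recurrent/input weights with the
    # smallest magnitude; return masks so retraining keeps them zero.
    masks = {}
    for name, p in model.rnn.named_parameters():
        if "weight" in name:
            k = int(frac * p.numel())
            thresh = p.abs().flatten().kthvalue(k).values
            masks[p] = (p.abs() > thresh).float()
            with torch.no_grad():
                p.mul_(masks[p])
    return masks

def accuracy(model, length=16):
    # Test on longer strings than seen in training to probe generalization.
    x, y = make_batch(n=512, length=length)
    return (model(x).argmax(dim=1) == y).float().mean().item()

pruned = TinyRNN()
train(pruned)                                 # ordinary training
masks = prune_smallest(pruned)                # prune small weights...
train(pruned, steps=200, masks=masks)         # ...then retrain survivors

decayed = TinyRNN()
train(decayed, weight_decay=1e-3)             # weight-decay alternative

print(f"pruned/retrained: {accuracy(pruned):.3f}")
print(f"weight decay:     {accuracy(decayed):.3f}")
```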
| Original language | English (US) |
| --- | --- |
| Title of host publication | Neural Networks for Signal Processing - Proceedings of the IEEE Workshop |
| Publisher | IEEE |
| Pages | 690-699 |
| Number of pages | 10 |
| State | Published - 1994 |
| Event | Proceedings of the 4th IEEE Workshop on Neural Networks for Signal Processing (NNSP'94) - Ermioni, Greece. Duration: Sep 6, 1994 → Sep 8, 1994 |
Other
| Other | Proceedings of the 4th IEEE Workshop on Neural Networks for Signal Processing (NNSP'94) |
| --- | --- |
| City | Ermioni, Greece |
| Period | 9/6/94 → 9/8/94 |
All Science Journal Classification (ASJC) codes
- Signal Processing
- Software
- Electrical and Electronic Engineering