Abstract
There has been much interest in learning long-term temporal dependencies with neural networks. Adequately learning such long-term information can be useful in many problems in signal processing, control, and prediction. One class of recurrent neural networks (RNNs), NARX neural networks, was shown to perform much better than other recurrent neural networks when learning simple long-term dependency problems. The intuitive explanation is that the output memories of a NARX network manifest as jump-ahead connections in the time-unfolded network. Here we show that similar improvements in learning long-term dependencies can be achieved with other classes of recurrent neural network architectures simply by increasing the order of the embedded memory. Experiments with locally recurrent networks and NARX (output feedback) networks show that all of these classes of network architectures show a significant improvement in learning long-term dependencies as the order of the embedded memory is increased, all other things held constant. These results are important for a user comfortable with a specific recurrent neural network architecture, because simply increasing the embedded memory order of that architecture makes it more robust to the problem of learning long-term dependencies.
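To make the notion of embedded memory order concrete, the following is a minimal sketch (not from the paper) of a NARX-style forward pass in which the next output depends on the last d_u inputs and the last d_y previous outputs, i.e. tapped delay lines on input and output. All names, sizes, and the random initialization are illustrative assumptions; increasing d_y corresponds to increasing the output memory order discussed in the abstract.

```python
import numpy as np

def narx_forward(u, d_u=3, d_y=3, hidden=8, seed=0):
    """Run a randomly initialized single-hidden-layer NARX model over input sequence u.

    d_u: input delay order (number of past inputs fed to the network)
    d_y: output delay order (the "embedded memory" order of the output feedback)
    """
    rng = np.random.default_rng(seed)
    W_in = rng.normal(scale=0.1, size=(hidden, d_u + d_y))  # hidden-layer weights
    b_in = np.zeros(hidden)
    W_out = rng.normal(scale=0.1, size=hidden)               # output weights

    y = np.zeros(len(u))
    for t in range(len(u)):
        # Tapped delay lines: the d_u most recent inputs and the d_y most
        # recent outputs, zero-padded at the start of the sequence.
        u_taps = [u[t - k] if t - k >= 0 else 0.0 for k in range(d_u)]
        y_taps = [y[t - k] if t - k >= 0 else 0.0 for k in range(1, d_y + 1)]
        x = np.array(u_taps + y_taps)
        h = np.tanh(W_in @ x + b_in)
        y[t] = W_out @ h  # this output is fed back at later time steps
    return y

# Example: a larger d_y lengthens the output memory (higher embedded memory order)
print(narx_forward(np.sin(np.linspace(0, 6, 50)), d_y=5)[:5])
```

In the time-unfolded network, each of the d_y output taps acts as a connection that skips over intermediate time steps, which is the intuition the abstract gives for why higher-order embedded memory eases long-term dependency learning.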
Original language | English (US) |
---|---|
Title of host publication | Neural Networks for Signal Processing - Proceedings of the IEEE Workshop |
Publisher | IEEE |
Pages | 34-43 |
Number of pages | 10 |
State | Published - 1997 |
Event | Proceedings of the 1997 7th IEEE Workshop on Neural Networks for Signal Processing, NNSP'97 - Amelia Island, FL, USA |
Duration | Sep 24, 1997 → Sep 26, 1997 |
Other
Other | Proceedings of the 1997 7th IEEE Workshop on Neural Networks for Signal Processing, NNSP'97 |
---|---|
City | Amelia Island, FL, USA |
Period | 9/24/97 → 9/26/97 |
All Science Journal Classification (ASJC) codes
- Signal Processing
- Software
- Electrical and Electronic Engineering