Time-delay neural networks: Representation and induction of finite-state machines

Daniel S. Clouse, C. Lee Giles, Bill G. Horne, Garrison W. Cottrell

Research output: Contribution to journal › Article › peer-review



In this work, we characterize and contrast the capabilities of the general class of time-delay neural networks (TDNN's) with input delay neural networks (IDNN's), the subclass of TDNN's with delays limited to the inputs. Each class of networks is capable of representing the same set of languages, those embodied by the definite memory machines (DMM's), a subclass of finite-state machines. We demonstrate the close affinity between TDNN's and DMM languages by learning a very large DMM (2048 states) using only a few training examples. Even though both architectures are capable of representing the same class of languages, they have distinguishable learning biases. Intuition suggests that general TDNN's, which include delays in hidden layers, should perform well compared to IDNN's on problems in which the output can be expressed as a function of narrow input windows that repeat in time. On the other hand, these general TDNN's should perform poorly when the input windows are wide, or when there is little repetition. We confirm these hypotheses via a set of simulations and statistical analysis.
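The defining property of the definite memory machines mentioned in the abstract is that each output depends only on a fixed-length window of the most recent inputs, not on unbounded history. The following is a minimal sketch of that property; the window length and the parity output function are hypothetical choices for illustration, not taken from the paper.

```python
from collections import deque

def run_dmm(inputs, window=3):
    """Simulate a definite memory machine: the output at each step is a
    function only of the most recent `window` input symbols.  The output
    function used here (parity of the window) is a hypothetical example."""
    buf = deque([0] * window, maxlen=window)  # fixed-length input window
    outputs = []
    for x in inputs:
        buf.append(x)                 # oldest symbol falls out automatically
        outputs.append(sum(buf) % 2)  # output = f(last `window` inputs)
    return outputs
```

Because the machine's state is exactly the contents of this bounded window, any two input sequences that agree on their last `window` symbols produce the same output, which is why such languages fit networks whose only memory is a tapped delay line on the inputs.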

Original language: English (US)
Pages (from-to): 1065-1070
Number of pages: 6
Journal: IEEE Transactions on Neural Networks
Issue number: 5
State: Published - 1997

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence


