Turing equivalence of neural networks with second order connection weights

Guo Zheng Sun, Hsing Hen Chen, Yee Chun Lee, C. Lee Giles

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

It is widely acknowledged that, in principle, a potentially infinite neural network (either in the number of neurons or in the precision of a single neural activity) could possess computational power equivalent to that of a Turing machine. In the present work, the authors demonstrate such an equivalence between Turing machines and several explicitly constructed neural networks. It is proven that for any given Turing machine there exists a recurrent neural network with local, second-order, and uniformly connected weights (i.e., weights connecting the second-order product of local 'input neurons' with their corresponding 'output neurons') that can simulate it. The numerical implementation and learning of such a neural Turing machine are also discussed.
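
The abstract refers to recurrent networks with second-order (multiplicative) connection weights, where a weight W[j, i, k] couples the product of state neuron i and input neuron k to the next value of state neuron j. The following is a minimal, hypothetical sketch of that update rule in NumPy; the network sizes, random weights, sigmoid activation, and one-hot input encoding are illustrative assumptions, not the construction used in the paper.

```python
import numpy as np

def second_order_rnn_step(W, state, inputs):
    """One update of a second-order recurrent network.

    W[j, i, k] weights the product of state neuron i and input neuron k
    feeding into state neuron j, so the preactivation of neuron j is
    sum_{i,k} W[j, i, k] * state[i] * inputs[k].
    """
    pre = np.einsum('jik,i,k->j', W, state, inputs)
    return 1.0 / (1.0 + np.exp(-pre))  # sigmoid activation

# Hypothetical example: 3 state neurons, 2 input symbols (one-hot encoded).
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 3, 2))           # second-order weight tensor
state = np.array([1.0, 0.0, 0.0])        # initial state
for symbol in [0, 1, 1, 0]:              # an arbitrary input string
    x = np.eye(2)[symbol]                # one-hot encoding of the symbol
    state = second_order_rnn_step(W, state, x)
print(state)
```

In the paper's constructive proof, the weights are chosen explicitly so that the network simulates a given Turing machine; here they are random purely to illustrate the update mechanics.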

Original language: English (US)
Title of host publication: Proceedings. IJCNN - International Joint Conference on Neural Networks
Editors: Anon
Publisher: Publ by IEEE
Pages: 357-362
Number of pages: 6
ISBN (Print): 0780301641
State: Published - 1992
Event: International Joint Conference on Neural Networks - IJCNN-91-Seattle - Seattle, WA, USA
Duration: Jul 8 1991 - Jul 12 1991

Publication series

Name: Proceedings. IJCNN - International Joint Conference on Neural Networks

Other

Other: International Joint Conference on Neural Networks - IJCNN-91-Seattle
City: Seattle, WA, USA
Period: 7/8/91 - 7/12/91

All Science Journal Classification (ASJC) codes

  • General Engineering
