TY - JOUR
T1 - Nonlinear system modeling with random matrices
T2 - Echo state networks revisited
AU - Zhang, Bai
AU - Miller, David J.
AU - Wang, Yue
N1 - Funding Information:
Manuscript received January 6, 2011; revised November 1, 2011; accepted November 2, 2011. Date of publication December 15, 2011; date of current version January 5, 2012. This work was supported in part by the National Institutes of Health, under Grant CA149147 and Grant NS029525.
PY - 2012
Y1 - 2012
N2 - Echo state networks (ESNs) are a novel form of recurrent neural networks (RNNs) that provide an efficient and powerful computational model for approximating nonlinear dynamical systems. A unique feature of an ESN is that it uses a large number of neurons (the 'reservoir') whose synaptic connections are generated randomly, with only the connections from the reservoir to the output modified by learning. Why a large, randomly generated, fixed RNN gives such excellent performance in approximating nonlinear systems is still not well understood. In this brief, we apply random matrix theory to examine the properties of random reservoirs in ESNs under different topologies (sparse or fully connected) and connection weights (Bernoulli or Gaussian). We quantify the asymptotic gap between the scaling factor bounds for the necessary and sufficient conditions previously proposed for the echo state property. We then show that the state transition mapping is contractive with high probability when only the necessary condition is satisfied, which corroborates, and thus analytically explains, the observation that in practice echo states are obtained when the spectral radius of the reservoir weight matrix is smaller than 1.
AB - Echo state networks (ESNs) are a novel form of recurrent neural networks (RNNs) that provide an efficient and powerful computational model for approximating nonlinear dynamical systems. A unique feature of an ESN is that it uses a large number of neurons (the 'reservoir') whose synaptic connections are generated randomly, with only the connections from the reservoir to the output modified by learning. Why a large, randomly generated, fixed RNN gives such excellent performance in approximating nonlinear systems is still not well understood. In this brief, we apply random matrix theory to examine the properties of random reservoirs in ESNs under different topologies (sparse or fully connected) and connection weights (Bernoulli or Gaussian). We quantify the asymptotic gap between the scaling factor bounds for the necessary and sufficient conditions previously proposed for the echo state property. We then show that the state transition mapping is contractive with high probability when only the necessary condition is satisfied, which corroborates, and thus analytically explains, the observation that in practice echo states are obtained when the spectral radius of the reservoir weight matrix is smaller than 1.
UR - http://www.scopus.com/inward/record.url?scp=84866504536&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84866504536&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2011.2178562
DO - 10.1109/TNNLS.2011.2178562
M3 - Article
C2 - 24808467
AN - SCOPUS:84866504536
SN - 2162-237X
VL - 23
SP - 175
EP - 182
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 1
M1 - 6105577
ER -