TY - JOUR
T1 - Neural Probabilistic Forecasting of Symbolic Sequences with Long Short-Term Memory
AU - Hauser, Michael
AU - Fu, Yiwei
AU - Phoha, Shashi
AU - Ray, Asok
N1 - Funding Information:
The first author has been supported by the PSU/ARL Walker Fellowship. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of the sponsoring agencies. The authors would like to thank Professor Domenic Santavicca and Mr. Jihang Li for kindly providing the experimental data used in this work.
Funding Information:
U.S. Air Force Office of Scientific Research (AFOSR) (Grant No. FA9550-15-1-0400).
Publisher Copyright:
© 2018 by ASME.
PY - 2018/8/1
Y1 - 2018/8/1
N2 - This paper makes use of long short-term memory (LSTM) neural networks for forecasting probability distributions of time series in terms of discrete symbols that are quantized from real-valued data. The developed framework formulates the forecasting problem into a probabilistic paradigm as h_θ: X × Y → [0, 1] such that Σ_{y ∈ Y} h_θ(x, y) = 1, where X is the finite-dimensional state space, Y is the symbol alphabet, and θ is the set of model parameters. The proposed method is different from standard formulations (e.g., autoregressive moving average (ARMA)) of time series modeling. The main advantage of formulating the problem in the symbolic setting is that density predictions are obtained without any significantly restrictive assumptions (e.g., second-order statistics). The efficacy of the proposed method has been demonstrated by forecasting probability distributions on chaotic time series data collected from a laboratory-scale experimental apparatus. Three neural architectures are compared, each with 100 different combinations of symbol-alphabet size and forecast length, resulting in a comprehensive evaluation of their relative performances.
AB - This paper makes use of long short-term memory (LSTM) neural networks for forecasting probability distributions of time series in terms of discrete symbols that are quantized from real-valued data. The developed framework formulates the forecasting problem into a probabilistic paradigm as h_θ: X × Y → [0, 1] such that Σ_{y ∈ Y} h_θ(x, y) = 1, where X is the finite-dimensional state space, Y is the symbol alphabet, and θ is the set of model parameters. The proposed method is different from standard formulations (e.g., autoregressive moving average (ARMA)) of time series modeling. The main advantage of formulating the problem in the symbolic setting is that density predictions are obtained without any significantly restrictive assumptions (e.g., second-order statistics). The efficacy of the proposed method has been demonstrated by forecasting probability distributions on chaotic time series data collected from a laboratory-scale experimental apparatus. Three neural architectures are compared, each with 100 different combinations of symbol-alphabet size and forecast length, resulting in a comprehensive evaluation of their relative performances.
UR - http://www.scopus.com/inward/record.url?scp=85044967289&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85044967289&partnerID=8YFLogxK
U2 - 10.1115/1.4039281
DO - 10.1115/1.4039281
M3 - Article
AN - SCOPUS:85044967289
SN - 0022-0434
VL - 140
JO - Journal of Dynamic Systems, Measurement and Control, Transactions of the ASME
JF - Journal of Dynamic Systems, Measurement and Control, Transactions of the ASME
IS - 8
M1 - 084502
ER -