Neural Probabilistic Forecasting of Symbolic Sequences with Long Short-Term Memory

Michael Hauser, Yiwei Fu, Shashi Phoha, Asok Ray

Research output: Contribution to journal › Article › peer-review



This paper makes use of long short-term memory (LSTM) neural networks for forecasting probability distributions of time series in terms of discrete symbols that are quantized from real-valued data. The developed framework casts the forecasting problem in a probabilistic paradigm as a map h_θ : X × Y → [0, 1] such that Σ_{y∈Y} h_θ(x, y) = 1, where X is the finite-dimensional state space, Y is the symbol alphabet, and θ is the set of model parameters. The proposed method differs from standard formulations of time series modeling, such as the autoregressive moving average (ARMA) model. The main advantage of the symbolic setting is that density predictions are obtained without significantly restrictive assumptions (e.g., second-order statistics). The efficacy of the proposed method is demonstrated by forecasting probability distributions on chaotic time series data collected from a laboratory-scale experimental apparatus. Three neural architectures are compared, each with 100 different combinations of symbol-alphabet size and forecast length, yielding a comprehensive evaluation of their relative performances.
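The first step the abstract describes, quantizing real-valued data into a finite symbol alphabet Y, can be sketched as follows. This is an illustrative reading, not the paper's exact procedure: the equal-frequency (quantile) partitioning, the alphabet size of 8, and the logistic-map signal used as a stand-in chaotic series are all assumptions for demonstration.

```python
import numpy as np

def symbolize(series, alphabet_size):
    """Map each real value to an integer symbol in {0, ..., alphabet_size - 1}.

    Uses quantile-based bin edges so that each symbol occurs with roughly
    equal frequency (an illustrative partitioning choice, not necessarily
    the one used in the paper).
    """
    # Interior quantiles give alphabet_size - 1 bin edges.
    edges = np.quantile(series, np.linspace(0, 1, alphabet_size + 1)[1:-1])
    return np.digitize(series, edges)

# Stand-in chaotic signal: a logistic-map trajectory.
x = np.empty(1000)
x[0] = 0.4
for t in range(999):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

symbols = symbolize(x, alphabet_size=8)
```

The resulting integer sequence is what an LSTM with a softmax output layer over the alphabet would be trained on, so that the network's output for a given history x is the predicted distribution h_θ(x, ·) over the next symbol.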

Original language: English (US)
Article number: 084502
Journal: Journal of Dynamic Systems, Measurement and Control, Transactions of the ASME
Issue number: 8
State: Published - Aug 1 2018

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Information Systems
  • Instrumentation
  • Mechanical Engineering
  • Computer Science Applications

