Noisy time series prediction using recurrent neural networks and grammatical inference

C. Lee Giles, Steve Lawrence, Ah Chung Tsoi

Research output: Contribution to journal › Article › peer-review

298 Scopus citations

Abstract

Financial forecasting is an example of a signal processing problem which is challenging due to small sample sizes, high noise, non-stationarity, and non-linearity. Neural networks have been very successful in a number of signal processing applications. We discuss fundamental limitations and inherent difficulties when using neural networks for the processing of high noise, small sample size signals. We introduce a new intelligent signal processing method which addresses these difficulties. The proposed method converts the signal into a symbolic representation with a self-organizing map, and performs grammatical inference with recurrent neural networks. We apply the method to the prediction of daily foreign exchange rates, addressing difficulties with non-stationarity, overfitting, and unequal a priori class probabilities, and we find significant predictability in comprehensive experiments covering 5 different foreign exchange rates. The method correctly predicts the direction of change for the next day with an error rate of 47.1%. The error rate reduces to around 40% when rejecting examples where the system has low confidence in its prediction. We show that the symbolic representation aids the extraction of symbolic knowledge from the trained recurrent neural networks in the form of deterministic finite state automata. These automata explain the operation of the system and are often relatively simple. Automata rules related to well known behavior such as trend following and mean reversal are extracted.
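The discretization step described in the abstract (converting a noisy real-valued series into a symbolic sequence via a self-organizing map) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 1-D SOM, the Gaussian neighbourhood, the node count of 3, and the decay schedules are all illustrative assumptions, and the "exchange rate" series is synthetic.

```python
import numpy as np

def train_som_1d(values, n_nodes=3, n_iters=500, lr0=0.5, sigma0=1.0, seed=0):
    """Train a 1-D self-organizing map on scalar inputs.

    The sorted node weights act as prototypes; each prototype index
    becomes one symbol of the alphabet fed to the recurrent network.
    """
    rng = np.random.default_rng(seed)
    weights = rng.uniform(values.min(), values.max(), size=n_nodes)
    for t in range(n_iters):
        x = values[rng.integers(len(values))]
        lr = lr0 * (1.0 - t / n_iters)                # decaying learning rate
        sigma = max(sigma0 * (1.0 - t / n_iters), 1e-3)
        bmu = np.argmin(np.abs(weights - x))          # best-matching unit
        dist = np.abs(np.arange(n_nodes) - bmu)       # distance on the lattice
        h = np.exp(-(dist ** 2) / (2.0 * sigma ** 2)) # neighbourhood function
        weights += lr * h * (x - weights)             # pull nodes toward x
    return np.sort(weights)

def to_symbols(values, weights):
    """Map each value to the index of its nearest SOM node (its symbol)."""
    return np.argmin(np.abs(values[:, None] - weights[None, :]), axis=1)

# Toy usage: symbolize the daily differences of a synthetic noisy series.
rng = np.random.default_rng(1)
rates = 1.5 + np.cumsum(rng.normal(0.0, 0.01, 300))  # stand-in for an FX rate
diffs = np.diff(rates)                               # work on daily changes
prototypes = train_som_1d(diffs, n_nodes=3)
symbols = to_symbols(diffs, prototypes)              # e.g. 0 = down, 2 = up
```

Working on differenced values rather than raw levels is one common way to reduce the effect of non-stationarity; the resulting symbol sequence is what a grammatical-inference model such as a recurrent network would then be trained on.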

Original language: English (US)
Pages (from-to): 161-183
Number of pages: 23
Journal: Machine Learning
Volume: 44
Issue number: 1-2
DOIs
State: Published - Jul 2001

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
