• Media type: E-Book
  • Title: State-of-The-Art Deep Learning Models are Superior for Time Series Forecasting and are Applied Optimally with Iterative Prediction Methods
  • Contributor: Murray, Cathal John [Author]; Du Bois, Naomi [Author]; Hollywood, Lynsey [Author]; Coyle, Damien [Author]
  • Published: [S.l.]: SSRN, 2023
  • Extent: 1 online resource (12 p.)
  • Language: English
  • DOI: 10.2139/ssrn.4361707
  • Keywords: time series prediction ; forecasting ; multi-horizon ; attention ; transformer ; LSTM
  • Description: In recent years, many new algorithms have been developed for applications in speech and image processing that may be repurposed for time series prediction. This paper presents a comprehensive comparative analysis of the time series forecasting capabilities of eight such state-of-the-art algorithms, namely: Vanilla Long Short-Term Memory (V-LSTM), Gated Recurrent Unit (GRU), Bidirectional LSTM (BD-LSTM), Autoencoder LSTM (AE-LSTM), Convolutional Neural Network LSTM (CNN-LSTM), LSTM with convolutional encoder (ConvLSTM), Attention mechanism networks and Transformer networks. Model performances are evaluated across ten benchmark datasets spanning fields of interest such as finance, weather and sales. Direct and iterative multi-step prediction methods are also comprehensively evaluated. For comprehensive and efficient model optimization, the asynchronous successive halving algorithm (ASHA) is applied to the training folds within a 10-fold cross-validation framework. Statistical tests are used to compare algorithm performances within and across datasets. We show that whilst there are differences between all models, the differences are insignificant for the top-performing models, which include the Transformer, Attention, V-LSTM, CNN-LSTM and ConvLSTM. However, the Transformer model consistently produces the lowest prediction error. We also show that the iterative multi-step-ahead prediction method is optimal for long-range prediction with these new algorithms.
  • Access State: Open Access
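
The iterative (recursive) multi-step prediction method highlighted in the abstract can be illustrated with a minimal sketch: a one-step-ahead model is applied repeatedly, with each prediction fed back into the input window. The `toy_model` below is a hypothetical stand-in for illustration only; the paper's actual models are the LSTM and Transformer variants listed above.

```python
import numpy as np

def iterative_forecast(model, window, horizon):
    """Predict `horizon` steps ahead by recursively feeding predictions back
    into the model's input window (iterative multi-step prediction)."""
    window = list(window)
    preds = []
    for _ in range(horizon):
        y = model(np.array(window))   # one-step-ahead prediction
        preds.append(float(y))
        window = window[1:] + [y]     # slide the window forward, appending the prediction
    return preds

# Hypothetical stand-in model: extrapolates the last linear step of the window.
toy_model = lambda w: 2 * w[-1] - w[-2]

history = [1.0, 2.0, 3.0, 4.0]
print(iterative_forecast(toy_model, history, 3))  # → [5.0, 6.0, 7.0]
```

By contrast, a direct method would train a separate model (or multi-output head) for each horizon, predicting all future steps from the original window without feedback.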