JOURNAL ARTICLE

Long short-term memory neural networks for one-step time series forecasting

Abstract

This master's thesis employs Long Short-Term Memory (LSTM) neural networks for one-step and multi-step time series forecasts. For this endeavor, a software stack including a deep learning framework is selected, and several machine learning and statistical models are implemented. The performance of the LSTM approaches is compared to carefully chosen benchmark methods on an exemplary real-world problem, with experiments run on powerful, cloud-based machines. To provide a methodological framework for time series forecasting projects, a seven-phase process model is elaborated. Further, to allow model selection for computationally intensive deep learning methods under limited resources, a modified form of blocked cross-validation together with a multi-stage Bayesian hyperparameter optimization approach is proposed. The proof of concept for the proposed methodology is conducted on a real-world problem in the domain of electricity demand forecasting. The implemented LSTM model clearly outperformed the benchmark models on all performance measures in the one-step walk-forward out-of-sample test, showing a roughly 10% lower root-mean-square error than the second-best model, which used double seasonal Holt-Winters exponential smoothing. Inspired by work in natural language processing, an encoder-decoder LSTM neural network was implemented for the multi-step scenario, as simpler architectures showed disappointing results. The multi-step LSTM forecaster also proved to be a competitive approach, but the purely statistical model retained the lead. However, due to resource constraints, the multi-step results could not be established with the same level of validity as the one-step case. By comparing the LSTM forecaster to the predictive performance of simple recurrent neural networks, the added value of the LSTM's more complex, gated cell architecture was indicated.
A downside of LSTM neural networks is their relatively long training time, which can be a problem for exhaustive hyperparameter searches. On the other hand, LSTM neural networks showed good generalization ability and needed comparably infrequent retraining.
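The one-step walk-forward out-of-sample test mentioned above can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the function names are hypothetical, and a naive persistence benchmark stands in for the LSTM and statistical forecasters.

```python
import math

def walk_forward_rmse(series, forecaster, n_test):
    """One-step walk-forward out-of-sample test: at each step the
    forecaster sees every observation before time t and predicts
    the value at t; squared errors are aggregated into one RMSE."""
    sq_errors = []
    for t in range(len(series) - n_test, len(series)):
        history = series[:t]          # all data available before t
        y_hat = forecaster(history)   # one-step-ahead prediction
        sq_errors.append((series[t] - y_hat) ** 2)
    return math.sqrt(sum(sq_errors) / len(sq_errors))

# Persistence benchmark: predict the last observed value.
def naive_forecaster(history):
    return history[-1]
```

On a linearly increasing series the persistence forecast is off by exactly one step at every point, so the resulting RMSE equals the step size.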
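Blocked cross-validation, mentioned above as a way to make model selection affordable for slow-to-train models, can be illustrated roughly as follows. The thesis's exact modification is not reproduced here; this is a generic sketch in which the function name, the per-block validation fraction, and the optional gap parameter are all assumptions.

```python
def blocked_cv_splits(n_samples, n_folds, val_frac=0.2, gap=0):
    """Split a time series into contiguous, non-overlapping blocks.
    Each block is divided chronologically into a training part and a
    validation part, optionally separated by a gap of `gap` points to
    reduce leakage between them."""
    block = n_samples // n_folds
    splits = []
    for k in range(n_folds):
        start, stop = k * block, (k + 1) * block
        n_val = max(1, int(block * val_frac))
        train = list(range(start, stop - n_val - gap))
        val = list(range(stop - n_val, stop))
        splits.append((train, val))
    return splits
```

Because each fold trains on its own contiguous block, the number of model fits stays at one per fold and the temporal order inside every fold is preserved, which is what makes this scheme attractive for computationally expensive deep learning models.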

Keywords:
Benchmark, Artificial neural network, Hyperparameter, Time series, Bayesian probability, Deep learning, Recurrent neural network

Metrics

Cited by: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 0
Citation Normalized Percentile: 0.31

Topics

Energy Load and Power Forecasting
Physical Sciences →  Engineering →  Electrical and Electronic Engineering
Stock Market Forecasting Methods
Social Sciences →  Decision Sciences →  Management Science and Operations Research
Forecasting Techniques and Applications
Social Sciences →  Decision Sciences →  Management Science and Operations Research

Related Documents

JOURNAL ARTICLE

Forecasting time series with long short-term memory networks

N.Q. Dung, Phạm Minh, Ivan Zelinka

Journal: Can Tho University Journal of Science  Year: 2020  Vol: 12(2)  Pages: 53-53
JOURNAL ARTICLE

SHORT-TERM FINANCIAL TIME SERIES ANALYSIS WITH LONG SHORT-TERM MEMORY NEURAL NETWORKS

M. V. Labusov

Journal: EKONOMIKA I UPRAVLENIE PROBLEMY RESHENIYA  Year: 2021  Vol: 3(4)  Pages: 165-177
BOOK-CHAPTER

Time Series Forecasting Using Long Short-Term Memory Neural Networks: A Case Study of Seismogram

Hilal H. Nuha, Mohamed Mohandes, Bo Liu, Ali Al‐Shaikhi

Series: Advances in Science, Technology & Innovation  Year: 2022  Pages: 207-209