JOURNAL ARTICLE

Transformer and long short-term memory networks for the long sequence time series forecasting problem

Abstract

The long sequence time-series forecasting (LSTF) problem has attracted attention from many organizations, since many prediction applications involve forecasting over long sequences. Under such circumstances, researchers have tried to solve these problems with models that have proved effective in the natural language processing field, such as long short-term memory (LSTM) networks and Transformers, and many improvements have been built on top of the basic recurrent neural network and Transformer architectures. Recently, a model called Informer, designed specifically for LSTF, was proposed and claimed to improve prediction performance on long sequence time-series forecasting. However, in later experiments, more and more researchers found that the Informer still cannot handle every long sequence time-series forecasting problem. This paper examines how datasets affect the performance of different models. The experiment is carried out on a Bitcoin dataset with four input features and one output. The results show that the Informer (a Transformer-like model) does not always perform well, so choosing a model with a simpler architecture can sometimes yield better results.
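The experimental setup the abstract describes (a multivariate series with four input features and one output, fed to an LSTM or the Informer) rests on the same preprocessing step in both cases: slicing the series into input/output windows. The sketch below illustrates that step only; the function name, window lengths, and toy data are hypothetical, not taken from the paper.

```python
# Minimal sketch of the supervised-learning setup for LSTF-style training:
# turning a multivariate series (four input features, one target) into
# sliding (input window, output window) pairs. Illustrative only.

def make_windows(features, target, in_len, out_len):
    """Slice parallel sequences into (input window, output window) pairs.

    features: list of per-step feature vectors, e.g. [[open, high, low, volume], ...]
    target:   list of per-step target values, e.g. closing prices
    """
    pairs = []
    for start in range(len(target) - in_len - out_len + 1):
        x = features[start : start + in_len]          # encoder input
        y = target[start + in_len : start + in_len + out_len]  # forecast horizon
        pairs.append((x, y))
    return pairs

# Toy series: 10 time steps, four features per step, one target per step.
feats = [[float(t), t + 1.0, t + 2.0, t + 3.0] for t in range(10)]
tgt = [float(t) * 10.0 for t in range(10)]

windows = make_windows(feats, tgt, in_len=4, out_len=2)
print(len(windows))   # number of training pairs
print(windows[0][1])  # first forecast window
```

A longer `out_len` relative to `in_len` is precisely what makes the LSTF setting hard, and is where the paper finds the simpler LSTM can compete with the Informer.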

Keywords:
Transformer, Long short-term memory, Recurrent neural network, Artificial neural network, Time series forecasting, Machine learning, Artificial intelligence

Metrics

- Cited by: 1
- FWCI (Field-Weighted Citation Impact): 0.32
- References: 0
- Citation Normalized Percentile: 0.54
Topics

Stock Market Forecasting Methods
Social Sciences →  Decision Sciences →  Management Science and Operations Research
Time Series Analysis and Forecasting
Physical Sciences →  Computer Science →  Signal Processing
Neural Networks and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence