JOURNAL ARTICLE

Survey of Transformer-based Time Series Forecasting Methods

Abstract

Time series forecasting, a critical technique for analyzing historical data to predict future trends, has been widely applied in fields such as finance and meteorology. However, traditional methods such as the autoregressive moving average model and exponential smoothing face limitations in modeling nonlinear patterns and capturing long-term dependencies. Recently, Transformer-based approaches, built on the self-attention mechanism, have achieved breakthroughs in natural language processing and computer vision, and have also shown significant promise in time series forecasting. Exploring how to apply Transformers efficiently to time series prediction has therefore become crucial for advancing this field. This paper first introduces the characteristics of time series data and explains the common task categories and evaluation metrics for time series forecasting. It then examines the basic architecture of the Transformer model and selects Transformer-derived forecasting models that have attracted widespread attention in recent years. These models are categorized by their modules and architectures, and are compared and analyzed from three perspectives: problem-solving capabilities, innovations, and limitations. Finally, this paper discusses potential future research directions for the application of Transformers in time series forecasting.
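To make the self-attention mechanism referenced above concrete, the following is a minimal sketch (not any specific surveyed model) of scaled dot-product self-attention applied to a windowed time series, using randomly initialized projection matrices; the window length, feature dimension, and function name are illustrative assumptions.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over one series window.

    x: (seq_len, d_model) array of time-step representations.
    w_q, w_k, w_v: (d_model, d_model) projection matrices (assumed learned).
    Returns the attended outputs and the attention weight matrix.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    # Pairwise affinities between every pair of time steps,
    # scaled by sqrt(d_k) to keep softmax gradients stable.
    scores = q @ k.T / np.sqrt(d_k)
    # Row-wise softmax: each time step distributes weight over all steps,
    # which is how long-range dependencies are captured directly.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Illustrative setup: 24 hourly observations, 8-dimensional embeddings.
rng = np.random.default_rng(0)
seq_len, d_model = 24, 8
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v = (rng.standard_normal((d_model, d_model)) for _ in range(3))
out, attn = self_attention(x, w_q, w_k, w_v)
```

Because every time step attends to every other step in a single operation, path length between distant observations is constant, which is the property the abstract contrasts with autoregressive methods; the cost is quadratic in window length, a limitation many of the surveyed variants target.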

Keywords:
Time series; Exponential smoothing; Autoregressive model; Autoregressive integrated moving average; Transformer

