JOURNAL ARTICLE

Time Series Forecasting Based on Convolution Transformer

Na WANG, Xianglian ZHAO

Year: 2023  Journal: IEICE Transactions on Information and Systems  Vol: E106.D (5)  Pages: 976-985  Publisher: Institute of Electronics, Information and Communication Engineers

Abstract

Time series forecasting is essential in many real-world fields. Recent studies have shown that the Transformer has clear advantages on such problems, especially with long input sequences and long-horizon forecasting. To improve the efficiency and local stability of the Transformer, these studies combine the Transformer with CNNs in various structures. However, previous Transformer-based forecasting models do not make full use of CNNs, and the two have not been combined effectively. To address this problem, we propose a time series forecasting algorithm based on a convolution Transformer with four components. (1) ES attention mechanism: external attention is combined with the traditional self-attention mechanism in a two-branch network, reducing the computational cost of self-attention while improving forecasting accuracy. (2) Frequency enhanced block: a frequency enhanced block is placed in front of the ES attention module to capture important structures in the time series through frequency-domain mapping. (3) Causal dilated convolution: the self-attention modules are connected by replacing the standard convolution layer with a causal dilated convolution layer, yielding an exponentially growing receptive field without additional computational cost. (4) Multi-layer feature fusion: the outputs of different self-attention modules are extracted, and convolutional layers adjust the feature-map sizes for fusion, producing finer-grained feature information at negligible computational cost.
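The two-branch ES attention idea in point (1) can be illustrated with a minimal numpy sketch. External attention attends over a small learned memory (here `Mk`, `Mv` with `S` slots), which costs O(n·S) instead of self-attention's O(n²); the fusion weight `alpha`, the unprojected self-attention branch, and the normalization details below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def external_attention(x, Mk, Mv):
    """Attend over a small learned external memory of S slots.
    Complexity is O(n*S) rather than the O(n^2) of self-attention."""
    attn = softmax(x @ Mk.T)                                # (n, S)
    attn = attn / (attn.sum(axis=0, keepdims=True) + 1e-9)  # double normalization
    return attn @ Mv                                        # (n, d)

def self_attention(x):
    """Plain (unprojected) self-attention for the second branch."""
    return softmax(x @ x.T / np.sqrt(x.shape[1])) @ x

def es_attention(x, Mk, Mv, alpha=0.5):
    """Hypothetical two-branch fusion of external and self-attention."""
    return alpha * external_attention(x, Mk, Mv) + (1 - alpha) * self_attention(x)

rng = np.random.default_rng(0)
n, d, S = 16, 8, 4                      # sequence length, model dim, memory slots
x = rng.standard_normal((n, d))
Mk = rng.standard_normal((S, d))
Mv = rng.standard_normal((S, d))
out = es_attention(x, Mk, Mv)           # (16, 8) fused output
```

Because the external branch's memory size `S` is fixed and small, its cost stays linear in the sequence length, which is where the efficiency gain over pure self-attention comes from.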
Experiments on real-world datasets show that the proposed forecasting model substantially improves the real-time forecasting performance of current state-of-the-art Transformer models, with significantly lower computation and memory costs. Compared with previous algorithms, it achieves greater improvements in both effectiveness and forecasting accuracy.
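Point (3) of the abstract, causal dilated convolution, can be sketched in a few lines of numpy. Output step `t` sees only inputs at `t, t-dilation, t-2*dilation, …`, so no future values leak in; stacking layers with dilations 1, 2, 4, … grows the receptive field exponentially with depth while the per-layer cost stays constant. The kernel and padding scheme below are illustrative, not the paper's exact configuration.

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation):
    """Causal dilated 1D convolution: output[t] depends only on
    x[t], x[t-dilation], x[t-2*dilation], ... (no future leakage)."""
    k = len(w)
    pad = (k - 1) * dilation                    # left-pad to stay causal
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    return np.array([
        sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

x = np.arange(8, dtype=float)
w = np.array([1.0, 1.0])                        # simple 2-tap kernel
y = causal_dilated_conv1d(x, w, dilation=2)
# y[t] = x[t] + x[t-2], with zeros before the series start:
# → [0, 1, 2, 4, 6, 8, 10, 12]
```

With a kernel of size k and dilations 1, 2, 4, …, 2^(L-1) over L layers, the receptive field is 1 + (k-1)(2^L - 1), i.e. exponential in depth at no extra cost per layer.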

Keywords:
Computer science, Transformer, Convolution (computer science), Algorithm, Artificial intelligence, Time series, Pattern recognition (psychology), Machine learning, Voltage, Artificial neural network

Metrics

Cited By: 13
FWCI (Field Weighted Citation Impact): 3.49
References: 21
Citation Normalized Percentile: 0.91


Topics

Time Series Analysis and Forecasting
Physical Sciences →  Computer Science →  Signal Processing
Stock Market Forecasting Methods
Social Sciences →  Decision Sciences →  Management Science and Operations Research
Currency Recognition and Detection
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition