JOURNAL ARTICLE

Enformer: Encoder-Based Sparse Periodic Self-Attention Time-Series Forecasting

Na Wang, Xianglian Zhao

Year: 2023 | Journal: IEEE Access | Vol: 11 | Pages: 112004-112014 | Publisher: Institute of Electrical and Electronics Engineers

Abstract

Time-series forecasting enables reasonable prediction and planning for stock forecasting, traffic flow, power consumption, and extreme-weather warning by exploiting the correlations within its own multi-dimensional data over time. The Transformer architecture, originally proposed for natural language processing, has proven effective at establishing global dependencies between vectors, which is clearly advantageous for time-series prediction. However, introducing the Transformer into time-series forecasting unchanged also brings corresponding problems, including high computational complexity and vector-information redundancy; moreover, bound to the processing conventions unique to natural language, it ignores the common processing methods of time-series forecasting. To this end, we design a lightweight Transformer structure that breaks the cascaded encoder-decoder structure of the traditional Transformer, building the network from encoders only. At the same time, a Coarse Matching Module (CMM) is proposed to construct multi-scale outputs of the input sequence, so that the subsequent networks can capture dependencies at different time granularities. The self-attention head is also redesigned as sparse periodic attention, which pays more attention to the periodic dependencies of the sequence context while reducing computational complexity. These changes give full play to the advantages of the Transformer, making it better suited to time-series prediction. Thanks to the synergy of the above innovations, the accuracy of the algorithm proposed in this paper improves by 7.1% on average in testing on multiple public datasets, and prediction time efficiency also improves by 44.1%. This demonstrates that reasonable improvements to the Transformer structure for time-series prediction can reduce the amount of computation while maintaining accuracy.
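The sparse periodic attention idea described in the abstract can be illustrated with a toy sketch. This is a hypothetical scalar-valued simplification, not the paper's actual formulation (the real head dimensions, projections, and period selection are not given in the abstract): each query position attends only to past positions spaced one period apart, cutting the per-query cost from O(n) to roughly O(n/period).

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def sparse_periodic_attention(q, k, v, period):
    """Toy sparse periodic attention over scalar queries/keys/values.

    For each position t, attend only to the periodic past positions
    t, t-period, t-2*period, ... instead of all t+1 positions, so each
    query touches about n/period keys rather than n.
    """
    n = len(q)
    out = []
    for t in range(n):
        idx = list(range(t, -1, -period))  # periodic positions back in time
        weights = softmax([q[t] * k[j] for j in idx])
        out.append(sum(w * v[j] for w, j in zip(weights, idx)))
    return out
```

With all-zero queries and keys the weights are uniform over the periodic positions, so with `period=2` and values `[1, 2, 3, 4]` position 3 averages values at positions 3 and 1, giving 3.0. A full Transformer head would apply this masking pattern inside the usual scaled dot-product attention over vectors; the scalar version above only shows the sparsity structure.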

Keywords:
Computer science, Granularity, Transformer, Time series, Encoder, Data mining, Artificial intelligence, Algorithm, Machine learning

Metrics

Cited By: 4
FWCI (Field-Weighted Citation Impact): 1.07
Refs: 40
Citation Normalized Percentile: 0.74

Topics

Time Series Analysis and Forecasting
Physical Sciences →  Computer Science →  Signal Processing
Stock Market Forecasting Methods
Social Sciences →  Decision Sciences →  Management Science and Operations Research
Energy Load and Power Forecasting
Physical Sciences →  Engineering →  Electrical and Electronic Engineering