Time-series forecasting supports prediction and planning in domains such as stock prices, traffic flow, power consumption, and extreme-weather warning by exploiting the correlations within multi-dimensional data over time. The Transformer architecture, originally proposed for natural language processing, has proved effective at modeling global dependencies between vectors, which is clearly advantageous for temporal prediction. However, introducing the Transformer into time-series forecasting unchanged brings corresponding problems, including high computational complexity and redundant vector information; bound to the processing conventions of natural language, it also ignores techniques common in time-series forecasting. To this end, we design a lightweight Transformer that breaks the cascaded encoder-decoder structure of the traditional Transformer and builds the network from encoders only. We further propose a Coarse Matching Module (CMM) that constructs multi-scale representations of the input sequence, so that subsequent layers can capture dependencies at different temporal granularities. The self-attention head is also redesigned as sparse periodic attention, which emphasizes the periodic dependencies of the sequence context while reducing computational complexity. Together, these changes exploit the strengths of the Transformer and make it better suited to time-series prediction. Thanks to the synergy of the above contributions, the proposed algorithm improves accuracy by 7.1% on average in experiments on multiple public datasets, and improves prediction-time efficiency by 44.1%. This demonstrates that a well-designed modification of the Transformer structure for time-series prediction can reduce computation while preserving accuracy.
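The abstract does not specify the exact sparsity pattern of sparse periodic attention, but the idea of restricting each query to a periodic subset of past positions can be illustrated with a minimal sketch. Here, each position t attends only to positions {t, t-p, t-2p, ...} for an assumed period p, cutting the cost from O(L^2) to roughly O(L^2 / p); the function name, periodic look-back pattern, and parameters are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

def sparse_periodic_attention(Q, K, V, period):
    """Toy single-head attention: query t attends only to the periodic
    look-back set {t, t-period, t-2*period, ...}. Illustrative sketch of
    the general idea; not the paper's exact formulation."""
    L, d = Q.shape
    out = np.zeros_like(V)
    for t in range(L):
        idx = np.arange(t, -1, -period)           # periodic look-back positions
        scores = Q[t] @ K[idx].T / np.sqrt(d)     # scaled dot-product scores
        w = np.exp(scores - scores.max())
        w /= w.sum()                              # softmax over the sparse set
        out[t] = w @ V[idx]                       # weighted sum of sparse values
    return out

rng = np.random.default_rng(0)
L, d = 16, 8
Q, K, V = rng.normal(size=(3, L, d))
y = sparse_periodic_attention(Q, K, V, period=4)  # shape (16, 8)
```

Because position 0 has only itself in its look-back set, its output equals V[0] exactly, which makes the sparsity pattern easy to sanity-check.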
Siyuan Hou, Feng Yuan, Zhaorui Li, Kai Li, Jing Tao