CHEN Jiajun, LIU Bo, LIN Weiwei, ZHENG Jianwen, XIE Jiachen
Time series forecasting, a critical technique for analyzing historical data to predict future trends, has been widely applied in fields such as finance and meteorology. However, traditional methods such as the autoregressive moving average model and exponential smoothing face limitations in modeling nonlinear patterns and capturing long-term dependencies. Recently, Transformer-based approaches, owing to their self-attention mechanism, have achieved breakthroughs in natural language processing and computer vision and have also shown significant promise in time series forecasting. Exploring how to apply Transformers efficiently to time series prediction has therefore become crucial for advancing the field. This paper first introduces the characteristics of time series data and explains the common task categories and evaluation metrics for time series forecasting. It then examines the basic architecture of the Transformer model and selects Transformer-derived forecasting models that have attracted widespread attention in recent years. These models are categorized by their modules and architectures, and are compared and analyzed from three perspectives: the problems they solve, their innovations, and their limitations. Finally, the paper discusses potential future research directions for the application of Transformers to time series forecasting.
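For reference, the self-attention mechanism cited above computes Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, letting every time step attend to every other step in the input window. Below is a minimal NumPy sketch of this computation; the variable names, projection sizes, and toy window length are illustrative assumptions, not details taken from the surveyed models.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over an embedded time series window.

    X          : (seq_len, d_model) embedded historical observations
    Wq, Wk, Wv : (d_model, d_k) projection matrices
    Returns a (seq_len, d_k) representation in which each time step
    aggregates information from all other steps, which is how
    long-range dependencies are captured without recurrence.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise similarity between time steps
    weights = softmax(scores, axis=-1)   # attention distribution per time step
    return weights @ V

# Toy usage: a 24-step window embedded into 8 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(24, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (24, 8)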