This study explores multivariate time series forecasting, centering on the transformer model. It examines the shortcomings of alternative predictive models such as Recurrent Neural Networks (RNNs) and Temporal Convolutional Networks (TCNs), particularly their inadequacy in handling autocorrelation. The transformer model stands out for its accuracy, owing to its attention mechanism, which focuses on the most relevant parts of the input. The research introduces a novel approach that employs the transformer's architecture for effective feature selection in time series data. A key aspect of this approach is unsupervised pre-training, which yields superior results compared to traditional fully supervised methods. This advancement underscores the effectiveness of unsupervised learning for time series regression, offering significant benefits across diverse scientific and industrial fields.
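The abstract does not include an implementation, but the attention mechanism it credits for the transformer's accuracy is the standard scaled dot-product attention. Below is a minimal NumPy sketch of self-attention over a toy multivariate series; the function name, toy dimensions, and use of NumPy are illustrative assumptions, not the paper's code.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V: arrays of shape (n_queries, d_k), (n_keys, d_k), (n_keys, d_v).
    Returns the attended output and the attention weight matrix.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_queries, n_keys) similarities
    scores -= scores.max(axis=-1, keepdims=True)    # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V, weights

# Toy multivariate series: 5 time steps, 4 features per step.
# Self-attention uses the same series as queries, keys, and values,
# letting each time step weigh every other step by similarity.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 4))
out, w = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (5, 4): one attended vector per time step
```

Each row of `w` is a probability distribution over time steps, which is how attention "focuses on the most relevant parts of the input" rather than processing the sequence strictly in order as an RNN does.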
Xing Zhang, Yanli Chen, Jinrong Mo, Wenjie Dai, Jietao Zheng, Chuan Bai, Jie Su
Jingwei Wang, Junkai Tan, Yang Han, Ming Zhao
Chonghao Zhang, Linlin Zhao, Zhefeng Yin, Zhenguo Zhang