Zelong Zhu, Chunna Zhao, Yaqun Huang
Abstract Time series prediction is important in many applications. In this study, we focus on long time series prediction. Recurrent Neural Networks are widely recognized as a fundamental architecture for processing time series data effectively, but they suffer from vanishing or exploding gradients on long sequences. To mitigate the gradient problem and improve accuracy, this paper proposes the Fractional Order Lipschitz Recurrent Neural Network (FOLRNN) model for long time series prediction. The proposed method exploits Lipschitz continuity to alleviate the gradient problem and applies fractional order integration to compute the hidden states of the recurrent network. Fractional order calculus can capture the intricate long-memory dynamics of long time series, yielding more accurate predictions than Lipschitz Recurrent Neural Network models. Self-attention is then used to enrich the feature representation: it describes correlations among features and improves prediction performance. Experiments show that the FOLRNN model achieves better results than the compared methods.
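The abstract does not spell out the hidden-state update, so the following is only a minimal sketch of the general idea of a fractional-order recurrent update: past hidden states are weighted by Grünwald–Letnikov binomial coefficients, giving the recurrence long memory. All function names, the update form, and the order `alpha` here are illustrative assumptions, not the paper's actual FOLRNN equations.

```python
import numpy as np

def gl_coeffs(alpha, n):
    # Grünwald–Letnikov coefficients w_k = (-1)^k * C(alpha, k),
    # via the standard recurrence w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k).
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

def fractional_rnn(x_seq, W_h, W_x, b, alpha=0.8):
    # Hypothetical fractional-order recurrent update (NOT the paper's exact model):
    # the new hidden state combines a standard tanh drive with a memory term
    # that weights all earlier hidden states by the GL coefficients.
    T = x_seq.shape[0]
    d = W_h.shape[0]
    h_hist = np.zeros((T + 1, d))          # h_hist[0] is the zero initial state
    w = gl_coeffs(alpha, T + 1)
    for t in range(1, T + 1):
        drive = np.tanh(W_h @ h_hist[t - 1] + W_x @ x_seq[t - 1] + b)
        # fractional memory over all past states (short-memory truncation omitted)
        memory = sum(w[k] * h_hist[t - k] for k in range(1, t + 1))
        h_hist[t] = drive - memory
    return h_hist[1:]                      # hidden states h_1 .. h_T
```

For `alpha = 1` the GL coefficients reduce to `[1, -1, 0, 0, ...]`, so the memory term collapses to `-h_{t-1}` and only one past state influences the update; for fractional `alpha` every earlier state contributes with slowly decaying weight, which is the long-memory property the abstract attributes to fractional calculus.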