JOURNAL ARTICLE

Sparse Gaussian Conditional Random Fields on Top of Recurrent Neural Networks

Xishun Wang, Minjie Zhang, Fenghui Ren

Year: 2018
Journal: Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 32 (1)
Publisher: Association for the Advancement of Artificial Intelligence

Abstract

Time-series prediction is widely used across disciplines. We propose CoR, Sparse Gaussian Conditional Random Fields (SGCRF) on top of Recurrent Neural Networks (RNN), for problems of this kind. CoR gains advantages from both RNN and SGCRF: it can not only effectively represent the temporal correlations in observed data, but can also learn the structured information of the output. CoR is challenging to train because it is a hybrid of deep neural networks and densely-connected graphical models. Alternating training can be a tractable way to train CoR, and furthermore, an end-to-end training method is proposed to train CoR more efficiently. CoR is evaluated on both synthetic and real-world data, and it shows a significant improvement in performance over state-of-the-art methods.
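To make the abstract's architecture concrete, here is a minimal, hedged sketch (not the authors' code) of the idea: an RNN encodes the observed sequence into features, and a Gaussian CRF output layer with a positive-definite precision matrix models structure among the outputs. All names (`rnn_step`, `encode`, the toy dimensions) are our own illustrative assumptions; the real model learns the precision matrix with a sparsity penalty.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_step(h, x, W_h, W_x, b):
    # Vanilla RNN cell: h_t = tanh(W_h h_{t-1} + W_x x_t + b)
    return np.tanh(W_h @ h + W_x @ x + b)

def encode(X, W_h, W_x, b):
    # Run the RNN over the sequence and return the final hidden state.
    h = np.zeros(W_h.shape[0])
    for x in X:
        h = rnn_step(h, x, W_h, W_x, b)
    return h

T, d_in, d_h, d_out = 12, 3, 8, 4
W_h = rng.normal(scale=0.3, size=(d_h, d_h))
W_x = rng.normal(scale=0.3, size=(d_h, d_in))
b = np.zeros(d_h)
W_out = rng.normal(scale=0.3, size=(d_out, d_h))

# Gaussian CRF output layer: p(y | x) ∝ exp(-1/2 y^T Λ y + θ(x)^T y),
# where Λ is a positive-definite precision matrix over the outputs
# (sparse in SGCRF; dense here for this toy example).
A = rng.normal(size=(d_out, d_out))
Lam = A @ A.T + d_out * np.eye(d_out)

X = rng.normal(size=(T, d_in))
theta = W_out @ encode(X, W_h, W_x, b)  # linear potentials from RNN features

# MAP prediction of the Gaussian CRF: solve Λ ŷ = θ(x), i.e. ŷ = Λ^{-1} θ(x)
y_hat = np.linalg.solve(Lam, theta)
print(y_hat.shape)  # (4,)
```

The prediction step shows why the output layer captures structure: every component of `y_hat` depends on all potentials through the inverse of the precision matrix, so correlations between output dimensions are modeled jointly rather than predicted independently.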

Keywords:
Recurrent neural network, Computer science, Gaussian conditional random field, Artificial intelligence, Artificial neural network, Graphical model, Machine learning, Synthetic data, Algorithm, Pattern recognition

Metrics

Cited By: 5
FWCI (Field-Weighted Citation Impact): 0.68
References: 30
Citation Normalized Percentile: 0.63

Topics

Time Series Analysis and Forecasting
Physical Sciences → Computer Science → Signal Processing
Gaussian Processes and Bayesian Inference
Physical Sciences → Computer Science → Artificial Intelligence
Anomaly Detection Techniques and Applications
Physical Sciences → Computer Science → Artificial Intelligence