JOURNAL ARTICLE

A Latent Variable Recurrent Neural Network for Discourse-Driven Language Models

Abstract

This paper presents a novel latent variable recurrent neural network architecture for jointly modeling sequences of words and (possibly latent) discourse relations between adjacent sentences. A recurrent neural network generates individual words, thus reaping the benefits of discriminatively-trained vector representations. The discourse relations are represented with a latent variable, which can be predicted or marginalized, depending on the task. The resulting model can therefore employ a training objective that includes not only discourse relation classification, but also word prediction. As a result, it outperforms state-of-the-art alternatives for two tasks: implicit discourse relation classification in the Penn Discourse Treebank, and dialog act classification in the Switchboard corpus. Furthermore, by marginalizing over latent discourse relations at test time, we obtain a discourse-informed language model, which improves over a strong LSTM baseline.
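The marginalization the abstract describes can be sketched numerically: the next-word distribution of the language model is a mixture over latent discourse relations z, weighted by a prior p(z | context). The sketch below is illustrative only; the relation count, vocabulary size, and distributions are assumptions standing in for the paper's trained RNN components, not its actual implementation.

```python
import numpy as np

# Toy sketch of marginalizing over a latent discourse relation z.
# All sizes and parameters are hypothetical stand-ins for the paper's
# trained RNN components.
rng = np.random.default_rng(0)

NUM_RELATIONS = 4   # e.g. the four top-level PDTB relation classes
VOCAB = 10          # toy vocabulary size

# p(z | context): softmax over relation scores (assumed parameterization)
relation_logits = rng.normal(size=NUM_RELATIONS)
p_z = np.exp(relation_logits) / np.exp(relation_logits).sum()

# p(word | z, context): per-relation next-word distributions
# (stand-in for the relation-conditioned RNN decoder)
word_logits = rng.normal(size=(NUM_RELATIONS, VOCAB))
p_w_given_z = np.exp(word_logits) / np.exp(word_logits).sum(axis=1, keepdims=True)

# Language modeling: marginalize, p(word) = sum_z p(z) * p(word | z)
p_w = p_z @ p_w_given_z

# Classification: predict the most probable relation instead
z_hat = int(p_z.argmax())

print(p_w.sum())  # a valid distribution: sums to 1
```

The same two quantities reflect the two uses of the latent variable in the abstract: summing over z gives the discourse-informed language model, while taking the argmax over p(z | context) gives the discourse relation classifier.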

Keywords:
Latent variable, Computer science, Variable (mathematics), Artificial intelligence, Artificial neural network, Natural language processing, Recurrent neural network, Linguistics, Mathematics, Philosophy

Metrics

Cited By: 109
FWCI (Field Weighted Citation Impact): 21.70
Refs: 55
Citation Normalized Percentile: 1.00 (in top 1% and top 10%)

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Speech and dialogue systems (Physical Sciences → Computer Science → Artificial Intelligence)