JOURNAL ARTICLE

Correlation Encoder-Decoder Model for Text Generation

Abstract

Text generation is crucial for many applications in natural language processing. With the prevalence of deep learning, the encoder-decoder architecture has become the dominant choice for this task. Accurately encoding the source information is of key importance to text generation, because the target text can be generated only when accurate and complete source information is captured by the encoder and fed into the decoder. However, most existing approaches fail to encode the entire source information effectively, as some features are easily lost during the encoding procedure. The decoder suffers from a similar loss of information. Reducing information loss in the encoder-decoder model is therefore critical for text generation. To address this issue, we propose a novel correlation encoder-decoder model, which optimizes both the encoder and the decoder to reduce information loss by encouraging each to minimize the differences between hierarchical layers through maximizing their mutual information. Experimental results on two benchmark datasets demonstrate that the proposed model substantially outperforms existing state-of-the-art methods. Our source code is publicly available on GitHub.
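The abstract does not specify how the mutual information between hierarchical layers is estimated or maximized. One common approach for this kind of objective is an InfoNCE-style contrastive bound, which treats matched layer representations as positive pairs and other items in the batch as negatives; minimizing the contrastive loss maximizes a lower bound on mutual information. The sketch below illustrates that general idea only; the function name, temperature value, and toy data are all hypothetical and not taken from the paper.

```python
import numpy as np

def info_nce_loss(z_a, z_b, temperature=0.1):
    """InfoNCE-style contrastive loss between paired representations.

    z_a, z_b: arrays of shape (N, D); row i of z_a and row i of z_b
    form a positive pair (e.g. the same input seen at two layers).
    Minimizing this loss maximizes a lower bound on their mutual
    information. This is a generic illustration, not the paper's
    exact objective.
    """
    # L2-normalize so the dot product is cosine similarity
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = (z_a @ z_b.T) / temperature          # (N, N) similarity matrix
    # Row i's positive is column i; the other columns act as negatives
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))

# Toy data: a "later layer" correlated with an "earlier layer"
rng = np.random.default_rng(0)
layer_lo = rng.normal(size=(8, 16))
layer_hi = layer_lo + 0.1 * rng.normal(size=(8, 16))
loss = info_nce_loss(layer_lo, layer_hi)
```

Because the two toy layers are strongly correlated, the loss is small; replacing `layer_hi` with unrelated noise drives it toward log N, the value for uninformative pairs.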

Keywords:
Encoder-decoder, Encoding, Decoding methods, Correlation, Mutual information, Deep learning, Text generation, Natural language processing

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
Refs: 64
Citation Normalized Percentile: 0.16

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Speech Recognition and Synthesis (Physical Sciences → Computer Science → Artificial Intelligence)