JOURNAL ARTICLE

Knowledge-Grounded Dialogue Generation with Pre-trained Language Models

Abstract

We study knowledge-grounded dialogue generation with pre-trained language models. To leverage the redundant external knowledge under a capacity constraint, we propose equipping response generation, defined by a pre-trained language model, with a knowledge selection module, and an unsupervised approach to jointly optimizing knowledge selection and response generation with unlabeled dialogues. Empirical results on two benchmarks indicate that our model significantly outperforms state-of-the-art methods in both automatic evaluation and human judgment.
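The pipeline described above can be sketched as two stages: a knowledge selection module that filters redundant external knowledge down to what is relevant, and a generator that conditions on the dialogue context plus the selected knowledge. The overlap-based scorer and the generator stub below are illustrative stand-ins for the paper's learned modules, which are trained jointly from unlabeled dialogues; all function names here are hypothetical.

```python
import re

def tokens(text):
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def select_knowledge(context, candidates):
    """Pick the candidate knowledge sentence most relevant to the context.

    Relevance is approximated here by word overlap; in the paper this
    module is learned jointly with response generation.
    """
    ctx = tokens(context)
    return max(candidates, key=lambda c: len(ctx & tokens(c)))

def generate_response(context, knowledge):
    """Placeholder for the pre-trained language model decoder, which
    conditions on the dialogue context and the selected knowledge."""
    return f"[response grounded in: {knowledge}]"

context = "Who wrote the novel Dune?"
candidates = [
    "Dune is a 1965 novel written by Frank Herbert.",
    "The Sahara is the largest hot desert in the world.",
]
selected = select_knowledge(context, candidates)
response = generate_response(context, selected)
```

Only the selected sentence reaches the generator, which is how the capacity constraint of a fixed-length pre-trained model is respected despite a large external knowledge pool.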

Keywords:
Knowledge-grounded dialogue generation; Pre-trained language models; Knowledge selection; Natural language generation


Topics

Topic Modeling (Computer Science → Artificial Intelligence)
Multimodal Machine Learning Applications (Computer Science → Computer Vision and Pattern Recognition)
Speech and dialogue systems (Computer Science → Artificial Intelligence)