JOURNAL ARTICLE

Towards Zero-Shot Persona Dialogue Generation with In-Context Learning

Abstract

Much work has been done to improve persona consistency by finetuning a pretrained dialogue model on high-quality human-annotated persona datasets. However, these methods still face high annotation cost and poor scalability. To this end, we propose a simple yet effective approach that significantly improves zero-shot persona consistency via in-context learning. Specifically, we first pretrain a persona-augmented dialogue generation model and then use an in-context prompting mechanism to achieve zero-shot persona customization. Experimental results demonstrate that our method dramatically improves persona consistency without compromising coherence or informativeness in zero-shot settings.
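The abstract's core idea, conditioning a pretrained dialogue model on persona facts supplied purely through the prompt rather than through finetuning, can be sketched as follows. This is a minimal illustration of in-context persona prompting in general; the prompt format and function names are assumptions for exposition, not the paper's actual implementation.

```python
# Sketch: assembling a zero-shot persona prompt for in-context learning.
# The persona facts are prepended as conditioning context so a pretrained
# dialogue model can stay persona-consistent without persona finetuning.
# All names and the template format here are illustrative assumptions.

def build_persona_prompt(persona_facts, dialogue_history, user_turn):
    """Build a single prompt string: persona facts first, then the
    dialogue history, then the new user turn and a generation cue."""
    lines = ["Persona:"]
    lines += [f"- {fact}" for fact in persona_facts]
    lines.append("Dialogue:")
    lines += [f"{speaker}: {text}" for speaker, text in dialogue_history]
    lines.append(f"User: {user_turn}")
    lines.append("Bot:")  # cue for the model to generate the next bot turn
    return "\n".join(lines)

prompt = build_persona_prompt(
    persona_facts=["I am a chef.", "I live in Lyon."],
    dialogue_history=[("User", "Hi!"), ("Bot", "Hello, nice to meet you.")],
    user_turn="What do you do for a living?",
)
print(prompt)
```

Because the persona enters only through the prompt, swapping in a new persona requires no retraining, which is the scalability advantage the abstract claims over finetuning-based methods.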

Keywords:
Persona; Personalization; Consistency; Scalability; Zero-shot learning; In-context learning; Dialogue generation; Human–computer interaction

Metrics

Cited By: 2
FWCI (Field-Weighted Citation Impact): 0.49
References: 26
Citation Normalized Percentile: 0.59


Topics

Persona Design and Applications (Physical Sciences → Computer Science → Human-Computer Interaction)
Innovative Human-Technology Interaction (Physical Sciences → Computer Science → Human-Computer Interaction)
Technology Use by Older Adults (Social Sciences → Demography)