JOURNAL ARTICLE

DIONYSUS: A Pre-trained Model for Low-Resource Dialogue Summarization

Abstract

Dialogue summarization has recently garnered significant attention due to its wide range of applications. However, existing methods for summarizing dialogues have limitations because they do not take into account the inherent structure of dialogue and rely heavily on labeled data, which can lead to poor performance in new domains. In this work, we propose DIONYSUS (dynamic input optimization in pre-training for dialogue summarization), a pre-trained encoder-decoder model for summarizing dialogues in any new domain. To pre-train DIONYSUS, we create two pseudo summaries for each dialogue example: one from a fine-tuned summarization model and the other from important dialogue turns. We then choose one of these pseudo summaries based on information distribution differences in different types of dialogues. This selected pseudo summary serves as the objective for pre-training DIONYSUS using a self-supervised approach on a large dialogue corpus. Our experiments show that DIONYSUS outperforms existing methods on six datasets, as demonstrated by its ROUGE scores in zero-shot and few-shot settings.
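The pseudo-summary selection step described in the abstract can be sketched as follows. This is a minimal illustration only: the `unigram_f1` helper and the ROUGE-1-style coverage criterion are assumptions standing in for the paper's actual information-distribution measure, and the function names are hypothetical.

```python
from collections import Counter

def unigram_f1(candidate: str, reference: str) -> float:
    """ROUGE-1-style unigram F1 between a candidate summary and a reference text."""
    cand = candidate.lower().split()
    ref = reference.lower().split()
    if not cand or not ref:
        return 0.0
    # Multiset intersection counts each overlapping token at most as often
    # as it appears in both strings.
    overlap = sum((Counter(cand) & Counter(ref)).values())
    precision = overlap / len(cand)
    recall = overlap / len(ref)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

def select_pseudo_summary(dialogue_turns, model_summary, principal_turns):
    """Pick the pseudo summary (abstractive model output vs. extracted
    'principal' turns) that better covers the full dialogue."""
    reference = " ".join(dialogue_turns)
    extractive = " ".join(principal_turns)
    if unigram_f1(model_summary, reference) >= unigram_f1(extractive, reference):
        return model_summary
    return extractive
```

In this sketch, the extracted-turns candidate tends to win on chatty dialogues whose content is spread across many turns (high lexical coverage), while the model-generated candidate wins when the dialogue's information is compressible into novel wording, loosely mirroring the selection intuition the abstract describes.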

Keywords:
Dialogue summarization; Pre-trained language models; Low-resource learning; Zero-shot learning; Few-shot learning; Self-supervised pre-training; Natural language processing

Metrics

Cited By: 3
FWCI (Field Weighted Citation Impact): 0.77
Refs: 49
Citation Normalized Percentile: 0.72


Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Speech and Dialogue Systems (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Low Resource Summarization using Pre-trained Language Models

Mubashir Munaf, Hammad Afzal, Khawir Mahmood, Naima Iltaf

Journal: ACM Transactions on Asian and Low-Resource Language Information Processing, Year: 2024, Vol: 23 (10), Pages: 1-19
JOURNAL ARTICLE

DialogLM: Pre-trained Model for Long Dialogue Understanding and Summarization

Ming Zhong, Yang Liu, Yichong Xu, Chenguang Zhu, Michael Zeng

Journal: Proceedings of the AAAI Conference on Artificial Intelligence, Year: 2022, Vol: 36 (10), Pages: 11765-11773
JOURNAL ARTICLE

Data Augmentation for Low-Resource Dialogue Summarization

Yongtai Liu, Joshua Maynez, Gonçalo Simões, Shashi Narayan

Journal: Findings of the Association for Computational Linguistics: NAACL 2022, Year: 2022, Pages: 703-710