JOURNAL ARTICLE

Improving Abstractive Text Summarization with History Aggregation

Abstract

Recent neural sequence-to-sequence models have provided feasible solutions for abstractive summarization. However, such models still struggle to capture long-range dependencies in the summarization task. A high-quality summarization system usually depends on a strong encoder that can distill important information from long input texts so that the decoder can generate salient summaries from the encoder's memory. In this paper, we propose an aggregation mechanism based on the Transformer model to address the challenge of long text representation. Our model reviews history information, giving the encoder greater memory capacity. Empirically, we apply the aggregation mechanism to the Transformer model and experiment on the CNN/DailyMail dataset, achieving higher-quality summaries than several strong baseline models on the ROUGE metrics.
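The abstract only describes the idea at a high level. Below is a minimal, hypothetical PyTorch sketch of one way such a history-aggregation encoder could be wired up, assuming the "history" is the set of hidden states produced by earlier encoder layers and that these are merged with learned softmax weights before being handed to the decoder as memory. The layer count, dimensions, and aggregation rule are assumptions for illustration, not the paper's reported architecture.

# Illustrative sketch only; not the paper's exact model.
import torch
import torch.nn as nn

class HistoryAggregationEncoder(nn.Module):
    def __init__(self, d_model=512, nhead=8, num_layers=6):
        super().__init__()
        # A stack of standard Transformer encoder layers.
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            for _ in range(num_layers)
        )
        # One learnable weight per retained representation
        # (the input embedding plus each layer's output).
        self.history_weights = nn.Parameter(torch.zeros(num_layers + 1))

    def forward(self, x, src_key_padding_mask=None):
        history = [x]  # keep every intermediate representation
        for layer in self.layers:
            x = layer(x, src_key_padding_mask=src_key_padding_mask)
            history.append(x)
        stacked = torch.stack(history, dim=0)            # (L+1, B, T, d_model)
        weights = torch.softmax(self.history_weights, dim=0)
        # Weighted sum over the layer axis: the encoder "reviews" its history
        # and the result is used as the memory consumed by the decoder.
        memory = (weights.view(-1, 1, 1, 1) * stacked).sum(dim=0)
        return memory

if __name__ == "__main__":
    enc = HistoryAggregationEncoder()
    tokens = torch.randn(2, 40, 512)   # (batch, src_len, d_model) dummy embeddings
    print(enc(tokens).shape)           # torch.Size([2, 40, 512])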

Keywords:
Automatic summarization, Computer science, Encoder, Transformer, Artificial intelligence, Natural language processing, Salient, Dependency, Sequence, Representation

Metrics

Cited by: 12
FWCI (Field-Weighted Citation Impact): 1.03
References: 68
Citation Normalized Percentile: 0.81

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Advanced Text Analysis Techniques (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE
Abstractive text summarization
Γιδιώτης, Αλέξιος Γεωργίου
Aristotle University of Thessaloniki, 2019

JOURNAL ARTICLE
Abstractive Text Summarization
Journal of Xidian University, 2020, Vol. 14 (6)

DISSERTATION
Toward abstractive text summarization
Shafieibavani, Elaheh
UNSWorks (University of New South Wales, Sydney, Australia), 2019