JOURNAL ARTICLE

Discourse-Aware Hierarchical Attention Network for Extractive Single-Document Summarization

Abstract

Discourse relations between sentences are often represented as a tree, and this tree structure provides important information for summarizers to create a short and coherent summary. However, current neural network-based summarizers treat the source document as a mere sequence of sentences and ignore the tree-like discourse structure inherent in the document. To incorporate the information of a discourse tree structure into neural network-based summarizers, we propose a discourse-aware neural extractive summarizer that can explicitly take into account the discourse dependency tree structure of the source document. Our discourse-aware summarizer can jointly learn the discourse structure and the salience score of a sentence by using novel hierarchical attention modules, which can be trained on automatically parsed discourse dependency trees. Experimental results showed that our model achieved competitive or better performance against state-of-the-art models in terms of ROUGE scores on the DailyMail dataset. We further conducted manual evaluations, whose results showed that our approach also improved the coherence of the output summaries.
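The abstract's core idea of jointly learning a soft discourse dependency structure and per-sentence salience can be illustrated with a minimal sketch. The code below is a hypothetical toy implementation, not the paper's actual architecture: every projection (`Wq`, `Wk`, `v`) and the way salience combines a sentence with its attended discourse head are assumptions for illustration only. Each sentence attends over all sentences as candidate discourse heads, so the attention matrix acts as a soft approximation of a dependency tree.

```python
import numpy as np

def softmax(x):
    """Row-wise softmax, numerically stabilized."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical toy setup: 4 sentences, 8-dimensional encodings.
rng = np.random.default_rng(0)
n, d = 4, 8
S = rng.standard_normal((n, d))          # sentence encodings (e.g., from an RNN)
Wq = rng.standard_normal((d, d)) * 0.1   # query projection (assumed)
Wk = rng.standard_normal((d, d)) * 0.1   # key projection (assumed)
v = rng.standard_normal(2 * d) * 0.1     # salience scorer (assumed)

# Parent attention: row i is a soft distribution over candidate
# discourse heads for sentence i -- a relaxation of a dependency tree.
scores = (S @ Wq) @ (S @ Wk).T / np.sqrt(d)
A = softmax(scores)                      # (n, n) soft head choices
context = A @ S                          # attended head representation

# Salience combines each sentence with its attended discourse head;
# top-scoring sentences form the extractive summary.
salience = np.tanh(np.concatenate([S, context], axis=1)) @ v
extract = np.argsort(-salience)[:2]      # indices of the top-2 sentences
print(sorted(extract.tolist()))
```

In training, the attention matrix would be supervised (or regularized) toward automatically parsed discourse dependency trees while the salience scores are trained on extractive labels, so both objectives shape the same attention module.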

Keywords:
Automatic summarization, computer science, natural language processing, artificial intelligence, salience, tree structure, dependency, sentence, tree, parsing, treebank, recurrent neural network, coherence, artificial neural network, data structure

Metrics

Cited by: 11
FWCI (Field-Weighted Citation Impact): 1.08
References: 39
Citation Normalized Percentile: 0.83


Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Advanced Text Analysis Techniques (Physical Sciences → Computer Science → Artificial Intelligence)