JOURNAL ARTICLE

A Hierarchical Neural Abstractive Summarization with Self-Attention Mechanism

Weijun Yang, Zhi-Cheng Tang, Xinhuai Tang

Year: 2018
Journal: Proceedings of the 2018 3rd International Conference on Automation, Mechanical Control and Computational Engineering (AMCCE 2018)

Abstract

Recently, attentional seq2seq models have made remarkable progress on abstractive summarization. However, most of these models do not consider the relations between the original sentences, which are an important feature in extractive methods. In this work, we propose a hierarchical neural model to address this problem. First, we use self-attention to discover the relations between the original sentences. Second, we use a copy mechanism to solve the OOV (out-of-vocabulary) problem. Experiments demonstrate that our model achieves state-of-the-art ROUGE scores on the LCSTS dataset.
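The abstract's first component, self-attention over sentence representations, can be illustrated with a minimal sketch. This is not the paper's implementation; it is a generic scaled dot-product self-attention over a toy matrix of sentence embeddings, where the attention weights model pairwise relations between sentences. All sizes and weight matrices (`S`, `Wq`, `Wk`, `Wv`) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(S, Wq, Wk, Wv):
    """Scaled dot-product self-attention over sentence vectors S (n x d).

    Each output row is a mixture of all sentence vectors, weighted by
    how strongly the sentences relate to one another.
    """
    Q, K, V = S @ Wq, S @ Wk, S @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # pairwise sentence affinities
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V                        # relation-aware sentence reps

rng = np.random.default_rng(0)
n, d = 4, 8                                   # toy: 4 sentences, 8-dim embeddings
S = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(S, Wq, Wk, Wv)
print(out.shape)                              # (4, 8): one vector per sentence
```

In the hierarchical setting the paper describes, such sentence-level attention would sit on top of a word-level encoder, so each row of `S` summarizes one source sentence before relations are modeled.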

Keywords:
Automatic summarization, Computer science, Mechanism (biology), Artificial intelligence, Natural language processing

Metrics

Cited By: 7
FWCI (Field-Weighted Citation Impact): 0.42
Refs: 14
Citation Normalized Percentile: 0.65


Topics

Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Text Analysis Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Natural Language Processing Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence