JOURNAL ARTICLE

Extractive summarization using siamese hierarchical transformer encoders

José Ángel González, Encarna Segarra, Fernando García, Emilio Sanchís, Lluís-F. Hurtado

Year: 2020 | Journal: Journal of Intelligent & Fuzzy Systems | Vol: 39 (2) | Pages: 2409-2419 | Publisher: IOS Press

Abstract

In this paper, we present an extractive approach to document summarization, the Siamese Hierarchical Transformer Encoders system, based on siamese neural networks and transformer encoders extended in a hierarchical way. The system, trained for binary classification, assigns an attention score to each sentence in the document; these scores are then used to select the most relevant sentences to build the summary. The main novelty of our proposal is the use of self-attention mechanisms at the sentence level for document summarization, instead of attention at the word level only. Experiments on the CNN/DailyMail summarization corpus show promising results in line with the state of the art.
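The score-then-select pipeline described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the sentence embeddings are assumed to come from a word-level encoder, and using the attention mass a sentence receives under scaled dot-product self-attention as its relevance score is a simplifying assumption standing in for the paper's learned sentence-level attention.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sentence_scores(sent_embs):
    """Score sentences via single-head scaled dot-product self-attention.

    sent_embs: (n_sentences, dim) array, assumed to be produced by a
    word-level encoder (hypothetical upstream component).
    Returns one relevance score per sentence, summing to 1.
    """
    d = sent_embs.shape[-1]
    attn = softmax(sent_embs @ sent_embs.T / np.sqrt(d), axis=-1)
    # Heuristic: a sentence's relevance is the average attention
    # it receives from all sentences in the document.
    return attn.sum(axis=0) / attn.shape[0]

def extract_summary(sentences, sent_embs, k=2):
    """Select the k highest-scoring sentences, kept in document order."""
    scores = sentence_scores(sent_embs)
    top = sorted(np.argsort(scores)[::-1][:k])
    return [sentences[i] for i in top]
```

A usage sketch: embed each sentence, call `extract_summary(sentences, embeddings, k=3)`, and concatenate the returned sentences as the extractive summary. The binary-classification training signal from the paper (summary-worthy vs. not) is omitted here.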

Keywords:
Automatic summarization, Computer science, Novelty, Transformer, Sentence, Encoder, Artificial intelligence, Natural language processing, Engineering, Voltage

Metrics

Cited by: 5
FWCI (Field-Weighted Citation Impact): 0.59
References: 19
Citation Normalized Percentile: 0.72


Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Text and Document Classification Technologies (Physical Sciences → Computer Science → Artificial Intelligence)