JOURNAL ARTICLE

Applying Siamese Hierarchical Attention Neural Networks for multi-document summarization.

José Ángel González-Barba, Julien Delonca, Emilio Sanchís Arnal, Fernando García, Encarna Segarra

Year: 2019 · Journal: RiuNet (Universitat Politècnica de València) · Vol: 63 · Pages: 111-118 · Publisher: Universitat Politècnica de València

Abstract

[EN] In this paper, we present an approach to multi-document summarization based on Siamese Hierarchical Attention Neural Networks. The attention mechanism of Hierarchical Attention Networks assigns each sentence a score as a function of its relevance to the classification process. For summarization, only these sentence scores are used: sentences are ranked by score and the most salient ones are selected. In this work we explore the adaptability of this model to multi-document summarization, which typically involves very long documents where the straightforward application of neural networks tends to fail. The experiments were carried out using CNN/DailyMail as the training corpus and DUC-2007 as the test corpus. Despite the differences between the characteristics of the training set (CNN/DailyMail) and the test set (DUC-2007), the results show the adequacy of this approach for multi-document summarization.
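The selection step the abstract describes — ranking sentences by their attention scores and keeping the most salient ones — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name and the example scores are hypothetical, standing in for attention weights produced by a trained Hierarchical Attention Network.

```python
# Hypothetical sketch of extractive selection from sentence-level
# attention scores (illustrative only; not the paper's actual code).

def select_salient(sentences, attention_scores, k=2):
    """Rank sentences by attention score and keep the top-k,
    restoring the original document order for readability."""
    ranked = sorted(range(len(sentences)),
                    key=lambda i: attention_scores[i], reverse=True)
    chosen = sorted(ranked[:k])  # top-k indices, back in document order
    return [sentences[i] for i in chosen]

doc = ["Sentence A.", "Sentence B.", "Sentence C.", "Sentence D."]
scores = [0.10, 0.45, 0.05, 0.40]  # e.g. attention weights from a HAN
print(select_salient(doc, scores, k=2))  # → ['Sentence B.', 'Sentence D.']
```

Restoring document order after ranking keeps the extracted summary coherent, since the highest-scoring sentences need not be adjacent in the source documents.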

Keywords:
Automatic summarization, Work (physics), Artificial neural network, Computer science, Humanities, Artificial intelligence, Data science, Engineering, Art

Metrics

Cited By: 1
FWCI (Field Weighted Citation Impact): 0.15
Refs: 0
Citation Normalized Percentile: 0.58

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)