JOURNAL ARTICLE

Distributed Hierarchical Sentence Embeddings for Unsupervised Extractive Text Summarization

Abstract

Unsupervised text summarization is a promising approach that avoids the human effort of writing reference summaries, which is particularly important for large-scale datasets. To improve its performance, we propose a hierarchical BERT [1] model that combines word-level and sentence-level training stages to produce semantically rich sentence embeddings. We use vanilla BERT for the word-level training, and redesign it for the sentence-level training with two new training tasks, "Sentence Token Prediction" and "Local Shuffle Recovery", together with a suitable input format. We first train the word-level model to obtain preliminary sentence embeddings, then feed them into the sentence-level model to extract higher-level, inter-sentence semantic information. The resulting context-sensitive sentence embeddings are clustered with the KMeans algorithm, and summaries are generated by extracting representative sentences from the document. To accelerate the training of the BERT model, we adopt PipeDream [2] model parallelism, which distributes the model layers across multiple machines so that training proceeds in parallel. Experimental results show that our proposed model outperforms most popular models and achieves a 2.7x training-time speedup on 4 machines.
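The final extraction step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the toy 2-D vectors stand in for the BERT-derived sentence embeddings, the KMeans loop is a plain NumPy version, and `extract_summary` is a hypothetical helper name. For each of `k` clusters, the sentence closest to the centroid is selected, and the picks are returned in document order.

```python
import numpy as np

def extract_summary(embeddings, sentences, k, iters=20, seed=0):
    """KMeans over sentence embeddings; return the sentence nearest
    each centroid, in original document order."""
    rng = np.random.default_rng(seed)
    X = np.asarray(embeddings, dtype=float)
    # Initialize centroids from k distinct sentence embeddings.
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # Assign each sentence to its nearest centroid.
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid as the mean of its cluster.
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    # Extract the sentence closest to each centroid.
    dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
    picked = sorted({int(dists[:, j].argmin()) for j in range(k)})
    return [sentences[i] for i in picked]

# Toy example: two well-separated "topics" among four sentences.
emb = [[0.0, 0.0], [0.1, 0.0], [10.0, 10.0], [10.0, 10.1]]
sents = ["s0", "s1", "s2", "s3"]
summary = extract_summary(emb, sents, k=2)
```

In the paper's pipeline the embeddings would come from the sentence-level BERT model, and `k` controls the summary length (one extracted sentence per cluster).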

Keywords:
Automatic summarization, Computer science, Sentence, Natural language processing, Artificial intelligence, Information retrieval

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 11
Citation Normalized Percentile: 0.14

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Advanced Text Analysis Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

BOOK-CHAPTER

Unsupervised Extractive Text Summarization Using Frequency-Based Sentence Clustering

Ali Hajjar, Joe Tekli

Series: Communications in Computer and Information Science, Year: 2022, Pages: 245-255
BOOK-CHAPTER

A Comparative Study of Sentence Embeddings for Unsupervised Extractive Multi-document Summarization

Salima Lamsiyah, Christoph Schommer

Series: Communications in Computer and Information Science, Year: 2023, Pages: 78-95
JOURNAL ARTICLE

Unsupervised hierarchical text summarization

S. Divya, N. Sripriya

Journal: AIP Conference Proceedings, Year: 2022, Vol: 2686, Pages: 060006-060006