JOURNAL ARTICLE

Extractive text summarization based on contrastive learning

Abstract

In the digital age, people face vast amounts of information and must sift through it to find content that meets their personal needs. Text summarization models can markedly improve the efficiency of information acquisition and help users capture the required information quickly. Most current extractive summarization methods encode the sentences of the original document, construct a model of the relationships between them, and select the highest-scoring sentences to form a summary. However, this approach tends to favor highly general sentences and ignores the coupling among the selected sentences. When encoding the sentences of a document, it is therefore essential that the representation vectors express the semantic information as accurately as possible. In this paper, we propose SimBERTSUM, a summarization framework based on contrastive learning, which improves summary quality by using contrastive learning to learn the feature information of the data more effectively. The model is validated on real-world data, and the approach has broad application prospects and practical significance in the field of natural language processing.
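The abstract describes two ingredients but gives no implementation details: a contrastive objective for learning better sentence representations, and top-score selection of sentences for the summary. The sketch below is illustrative only, not the paper's actual SimBERTSUM objective: it uses a generic InfoNCE-style (NT-Xent) contrastive loss over in-batch negatives, with NumPy in place of a BERT encoder, and a simple top-k selector; all function names are ours.

```python
import numpy as np


def normalize(x):
    """L2-normalize row vectors so dot products become cosine similarities."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)


def info_nce_loss(anchors, positives, temperature=0.1):
    """Generic InfoNCE contrastive loss (an assumption, not the paper's exact loss).

    Each anchor embedding is pulled toward its positive (same row index)
    and pushed away from the other positives in the batch (in-batch negatives).
    """
    a = normalize(anchors)
    p = normalize(positives)
    logits = a @ p.T / temperature          # (n, n) cosine-similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives sit on the diagonal; minimize their negative log-likelihood
    return -np.mean(np.diag(log_probs))


def select_top_sentences(scores, k=3):
    """Pick the k highest-scoring sentences, returned in document order."""
    top = np.argsort(np.asarray(scores))[::-1][:k]
    return sorted(top.tolist())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sents = rng.normal(size=(4, 8))           # stand-in sentence embeddings
    # identical views -> low loss; unrelated views -> higher loss
    print(info_nce_loss(sents, sents) < info_nce_loss(sents, rng.normal(size=(4, 8))))
    print(select_top_sentences([0.1, 0.9, 0.4, 0.8], k=2))  # -> [1, 3]
```

In practice the anchor/positive pairs would come from two augmented views of the same sentence encoded by BERT, and the loss would be minimized with gradient descent; the selector then extracts the summary from the learned sentence scores.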

Keywords:
Automatic summarization; Contrastive learning; Natural language processing; Information retrieval; Feature extraction; Artificial intelligence

Metrics

Cited by: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 5
Citation Normalized Percentile: 0.23

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Advanced Text Analysis Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
© 2026 ScienceGate Book Chapters — All rights reserved.