JOURNAL ARTICLE

A Contrastive Framework to Enhance Unsupervised Sentence Representation Learning

Abstract

Recently, significant progress has been made in generating high-quality sentence representations through contrastive learning. SimCSE-like models improve the uniformity of the representation space by pulling positive pairs together and pushing negative pairs apart. However, these models often suffer from semantic monotonicity in their augmented positives, from sampling bias in their negatives, and from training outcomes that depend on batch size. To address these problems, this paper proposes CEUR, a contrastive framework that enhances unsupervised sentence representation learning. CEUR adopts a sample-augmentation method grounded in linguistic knowledge: positive samples are generated by synonym repetition, and negative samples by antonym replacement. To improve the consistency of the representation space, CEUR applies an instance-weighting method that reduces sampling bias. Going a step further, CEUR uses momentum contrast to increase the number of trainable negative samples beyond the batch. Extensive experimental results show that CEUR outperforms existing baseline models in comprehensive performance on seven semantic textual similarity (STS) tasks.
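The abstract combines a SimCSE-style contrastive objective with instance weighting over negatives (to reduce sampling bias) and a momentum-contrast queue of extra negatives. As a rough illustration only: the paper's exact loss, temperature, and weighting rule are not given on this page, so the function name, the weight values, and the temperature below are assumptions, sketched in plain Python.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense sentence embeddings."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def weighted_infonce(anchor, positive, negatives, weights, tau=0.05):
    """InfoNCE-style contrastive loss with per-negative instance weights.

    `weights[i]` down-weights negatives suspected of being false
    negatives (the sampling bias the abstract mentions). In a momentum-
    contrast setup, `negatives` would also include embeddings drawn
    from a queue filled by a slowly updated momentum encoder, so the
    effective number of negatives is no longer tied to batch size.
    """
    pos = math.exp(cosine(anchor, positive) / tau)
    neg = sum(w * math.exp(cosine(anchor, n) / tau)
              for w, n in zip(weights, negatives))
    return -math.log(pos / (pos + neg))

# Toy usage: down-weighting a suspected false negative lowers the loss.
a = [1.0, 0.0]
p = [0.9, 0.1]
negs = [[0.0, 1.0], [-0.8, 0.6]]
loss_full = weighted_infonce(a, p, negs, [1.0, 1.0])
loss_down = weighted_infonce(a, p, negs, [0.2, 1.0])
```

Down-weighting a negative shrinks its contribution to the denominator, so `loss_down < loss_full`; this is the intended effect of instance weighting when the first negative is actually semantically close to the anchor.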

Keywords:
Computer science, Sentence, Artificial intelligence, Natural language processing, Weighting, Representation, Contrast, Similarity, Consistency, Sampling, Metric, Repetition, Synonym, Machine learning, Linguistics

Metrics

Cited By: 1
FWCI (Field Weighted Citation Impact): 0.26
Refs: 32
Citation Normalized Percentile: 0.53


Topics

Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Natural Language Processing Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE

Unsupervised Sentence Representation via Contrastive Learning with Mixing Negatives

Yanzhao Zhang, Richong Zhang, Samuel Mensah, Xudong Liu, Yongyi Mao

Journal: Proceedings of the AAAI Conference on Artificial Intelligence · Year: 2022 · Vol: 36 (10) · Pages: 11730-11738
JOURNAL ARTICLE

Kalman contrastive unsupervised representation learning

Mohammad Mahdi Jahani Yekta

Journal: Scientific Reports · Year: 2024 · Vol: 14 (1) · Pages: 30243-30243
JOURNAL ARTICLE

Debiased Contrastive Learning of Unsupervised Sentence Representations

Kun Zhou, Beichen Zhang, Xin Zhao, Ji-Rong Wen

Journal: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) · Year: 2022
© 2026 ScienceGate Book Chapters — All rights reserved.