JOURNAL ARTICLE

Patch-Mixing Contrastive Regularization for Few-Label Semi-Supervised Learning

Xiaochang Hu, Xin Xu, Yujun Zeng, Xihong Yang

Year: 2023   Journal: IEEE Transactions on Artificial Intelligence   Vol: 5 (1)   Pages: 384-397   Publisher: Institute of Electrical and Electronics Engineers

Abstract

Recently, consistency regularization has become a fundamental component of semi-supervised learning: it encourages the network's predictions on unlabeled data to be invariant to perturbations. However, its performance degrades drastically when labels are scarce, e.g., two labels per category. In this article, we analyze the semantic bias problem of consistency regularization in semi-supervised learning and find that it stems from imposing consistency on semantically biased positive sample pairs produced by the indispensable data augmentation. Based on this analysis, we propose a patch-mixing contrastive regularization approach, called $p$-Mix, for semi-supervised learning with scarce labels. In $p$-Mix, the magnitude of semantic bias is estimated by weighting augmented samples in the embedding space. Specifically, samples are mixed in both the sample space and the embedding space to construct more reliable and task-relevant positive sample pairs. A patch-mixing contrastive objective is then designed to indicate the magnitude of semantic bias by using a mixed embedding weighted by virtual soft labels. Extensive experiments demonstrate that $p$-Mix significantly outperforms current state-of-the-art approaches. Notably, $p$-Mix achieves 91.95% accuracy on the CIFAR-10 benchmark with only two labels available per category, exceeding the second-best method, ICL-SSL, by 3.22%.
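The mixing strategy the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names, the fixed-center patch, and the InfoNCE-style form of the weighted objective are all assumptions made for the sketch. The idea shown is (1) CutMix-style patch mixing in sample space, with the actual patch area defining a "virtual soft label" (λ, 1 − λ); (2) linear interpolation in embedding space; and (3) a contrastive objective that pulls the mixed embedding toward both sources with weights given by that soft label.

```python
import numpy as np

def patch_mix(a, b, lam):
    """CutMix-style patch mixing: paste a patch of image b into image a.
    The patch covers roughly a (1 - lam) fraction of the image area."""
    h, w = a.shape[:2]
    cut_h = int(h * np.sqrt(1.0 - lam))
    cut_w = int(w * np.sqrt(1.0 - lam))
    cy, cx = h // 2, w // 2  # fixed center for determinism; usually sampled randomly
    yt, yb = max(cy - cut_h // 2, 0), min(cy + cut_h // 2, h)
    xl, xr = max(cx - cut_w // 2, 0), min(cx + cut_w // 2, w)
    mixed = a.copy()
    mixed[yt:yb, xl:xr] = b[yt:yb, xl:xr]
    # Recompute the mixing ratio from the actual patch area: this pair
    # (lam_adj, 1 - lam_adj) acts as the virtual soft label of the mixed sample.
    lam_adj = 1.0 - (yb - yt) * (xr - xl) / (h * w)
    return mixed, lam_adj

def mix_embeddings(z1, z2, lam):
    """Linear interpolation in embedding space, re-projected to the unit sphere."""
    z = lam * z1 + (1.0 - lam) * z2
    return z / np.linalg.norm(z)

def weighted_contrastive_loss(z_mix, z1, z2, negatives, lam, tau=0.5):
    """InfoNCE-style objective: the mixed embedding is pulled toward BOTH
    source embeddings, weighted by the virtual soft label (lam, 1 - lam)."""
    logits = np.array([z_mix @ z1, z_mix @ z2] + [z_mix @ n for n in negatives]) / tau
    log_probs = logits - np.log(np.exp(logits).sum())  # log-softmax over candidates
    return -(lam * log_probs[0] + (1.0 - lam) * log_probs[1])

# Toy usage with random unit embeddings.
rng = np.random.default_rng(0)
img_a, img_b = np.zeros((32, 32, 3)), np.ones((32, 32, 3))
mixed, lam = patch_mix(img_a, img_b, lam=0.7)
z1, z2 = rng.normal(size=8), rng.normal(size=8)
z1, z2 = z1 / np.linalg.norm(z1), z2 / np.linalg.norm(z2)
negs = [v / np.linalg.norm(v) for v in rng.normal(size=(4, 8))]
loss = weighted_contrastive_loss(mix_embeddings(z1, z2, lam), z1, z2, negs, lam)
```

Because the mixed sample is anchored to both of its sources with soft weights, the objective avoids treating a heavily mixed sample as a perfect positive of either source, which is the semantic-bias failure mode the paper attributes to plain consistency regularization.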

Keywords:
Notation, Regularization, Embedding, Weighting, Mathematics, Artificial intelligence, Computer science, Arithmetic

Metrics

Cited By: 4
FWCI (Field Weighted Citation Impact): 1.02
Refs: 61
Citation Normalized Percentile: 0.75

Topics

Text and Document Classification Technologies
Physical Sciences →  Computer Science →  Artificial Intelligence
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Machine Learning and Data Classification
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Interpolation-Based Contrastive Learning for Few-Label Semi-Supervised Learning

Xihong Yang, Xiaochang Hu, Sihang Zhou, Xinwang Liu, En Zhu

Journal: IEEE Transactions on Neural Networks and Learning Systems   Year: 2022   Vol: 35 (2)   Pages: 2054-2065
CONFERENCE PAPER

Contrastive Regularization for Semi-Supervised Learning

Doyup Lee, Sungwoong Kim, Ildoo Kim, Yeongjae Cheon, Minsu Cho, Wook-Shin Han

Conference: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)   Year: 2022   Pages: 3910-3919
CONFERENCE PAPER

CoMatch: Semi-supervised Learning with Contrastive Graph Regularization

Junnan Li, Caiming Xiong, Steven C. H. Hoi

Conference: 2021 IEEE/CVF International Conference on Computer Vision (ICCV)   Year: 2021   Pages: 9455-9464
CONFERENCE PAPER

Few-Shot Text Classification via Semi-Supervised Contrastive Learning

Fei Wang, Long Chen, Fei Xie, Cai Xu, Guangyue Lu

Conference: 2022 4th International Conference on Natural Language Processing (ICNLP)   Year: 2022   Pages: 426-433