JOURNAL ARTICLE

JGCL: Joint Self-Supervised and Supervised Graph Contrastive Learning

Selahattin Akkas, Ariful Azad

Year: 2022 Journal: Companion Proceedings of the Web Conference 2022 Pages: 1099-1105

Abstract

Semi-supervised and self-supervised learning on graphs are two popular avenues for graph representation learning. We demonstrate that no single semi-supervised or self-supervised method works uniformly well across all settings of the node classification task. Self-supervised methods generally work well with very limited training data, but their performance can be further improved by exploiting the limited label information. We propose joint self-supervised and supervised graph contrastive learning (JGCL) to capture the mutual benefits of both learning strategies. JGCL utilizes both supervised and self-supervised data augmentation and a joint contrastive loss function. Our experiments demonstrate that JGCL and its variants are among the best performers across various proportions of labeled data when compared with state-of-the-art self-supervised, unsupervised, and semi-supervised methods on various benchmark graphs.
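The abstract describes a joint loss that combines a self-supervised contrastive term (over augmented views) with a supervised contrastive term (over the few labeled nodes). The paper's exact formulation is not given here, so the following is a minimal NumPy sketch under common assumptions: InfoNCE for the self-supervised term, a SupCon-style term for labeled nodes, and a weighting factor `lam`; all function names and the `-1` unlabeled marker are hypothetical, not from the paper.

```python
import numpy as np

def _normalize(z):
    """Row-wise L2 normalization of node embeddings."""
    return z / np.linalg.norm(z, axis=1, keepdims=True)

def _log_softmax_rows(sim):
    """Numerically stable row-wise log-softmax."""
    m = sim.max(axis=1, keepdims=True)
    return sim - (m + np.log(np.exp(sim - m).sum(axis=1, keepdims=True)))

def info_nce(z1, z2, tau=0.5):
    """Self-supervised contrastive (InfoNCE) loss between two views.

    Row i of z1 and row i of z2 embed the same node under two graph
    augmentations; they form the positive pair, all other rows are negatives."""
    sim = _normalize(z1) @ _normalize(z2).T / tau
    return -np.mean(np.diag(_log_softmax_rows(sim)))

def sup_con(z, labels, tau=0.5):
    """Supervised contrastive loss: nodes sharing a label are positives.

    Unlabeled nodes carry the (hypothetical) marker -1 and are never anchors."""
    n = len(labels)
    sim = _normalize(z) @ _normalize(z).T / tau
    np.fill_diagonal(sim, -np.inf)  # never contrast a node with itself
    logits = _log_softmax_rows(sim)
    same = labels[:, None] == labels[None, :]
    mask = same & (labels[:, None] >= 0) & ~np.eye(n, dtype=bool)
    pos = mask.sum(axis=1)
    valid = pos > 0  # only anchors with at least one labeled positive
    per_anchor = -np.where(mask, logits, 0.0).sum(axis=1)[valid] / pos[valid]
    return per_anchor.mean()

def joint_loss(z1, z2, labels, lam=1.0):
    """Joint objective: self-supervised term plus weighted supervised term."""
    return info_nce(z1, z2) + lam * sup_con(z1, labels)
```

In this sketch the supervised term uses only the labeled subset of nodes, matching the semi-supervised setting the abstract targets, while the self-supervised term uses every node; `lam` trades the two off.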

Keywords:
Semi-supervised learning, Computer science, Artificial intelligence, Supervised learning, Machine learning, Graph, Pattern recognition (psychology), Benchmark (surveying), Unsupervised learning, Feature learning, Artificial neural network, Theoretical computer science

Metrics

Cited By: 5
FWCI (Field Weighted Citation Impact): 0.59
Refs: 26
Citation Normalized Percentile: 0.64

Topics

Advanced Graph Neural Networks
Physical Sciences →  Computer Science →  Artificial Intelligence
Epigenetics and DNA Methylation
Life Sciences →  Biochemistry, Genetics and Molecular Biology →  Molecular Biology
Recommender Systems and Techniques
Physical Sciences →  Computer Science →  Information Systems