JOURNAL ARTICLE

Understanding Negative Sampling in Knowledge Graph Embedding

Jing Qian, Gangmin Li, Katie Atkinson, Yong Yue

Year: 2021   Journal: International Journal of Artificial Intelligence & Applications   Vol: 12 (1)   Pages: 71-81

Abstract

Knowledge graph embedding (KGE) projects the entities and relations of a knowledge graph (KG) into a low-dimensional vector space, and has made steady progress in recent years. Conventional KGE methods, especially translational distance-based models, are trained by discriminating positive samples from negative ones. Most KGs store only positive samples for space efficiency, so negative sampling plays a crucial role in encoding the triples of a KG. The quality of the generated negative samples has a direct impact on the performance of the learnt knowledge representation in a myriad of downstream tasks, such as recommendation, link prediction and node classification. We summarize current negative sampling approaches in KGE into three categories: static distribution-based, dynamic distribution-based and custom cluster-based. Based on this categorization, we discuss the most prevalent existing approaches and their characteristics. We hope this review can provide some guidelines for new thinking about negative sampling in KGE.
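As a concrete reference point, the sketch below illustrates the simplest static distribution-based strategy surveyed in the paper: corrupting the head or tail of a positive triple with an entity drawn uniformly at random. The function name and toy data are illustrative assumptions, not code from the paper.

```python
import random

def uniform_negative_sample(triple, num_entities, positive_set):
    """Static distribution-based negative sampling (uniform corruption):
    replace the head or tail of a positive triple with a uniformly drawn
    entity, rejecting candidates that are already known positives."""
    head, relation, tail = triple
    while True:
        corrupt_entity = random.randrange(num_entities)
        # Flip a fair coin: corrupt either the head or the tail.
        if random.random() < 0.5:
            candidate = (corrupt_entity, relation, tail)
        else:
            candidate = (head, relation, corrupt_entity)
        if candidate not in positive_set:
            return candidate

# Toy usage: 5 entities (ids 0..4), a single relation type (id 0).
positives = {(0, 0, 1), (2, 0, 3)}
print(uniform_negative_sample((0, 0, 1), num_entities=5, positive_set=positives))
```

Dynamic distribution-based and custom cluster-based methods replace the uniform draw above with a distribution that adapts during training or is restricted to a cluster of candidate entities, respectively.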

Keywords:
Embedding, Computer science, Sampling (signal processing), Categorization, Graph, Theoretical computer science, Knowledge graph, Vector space, Data mining, Machine learning, Artificial intelligence, Mathematics

Metrics

Cited By: 10
FWCI (Field-Weighted Citation Impact): 1.13
References: 69
Citation Normalized Percentile: 0.81


Topics

Advanced Graph Neural Networks
Physical Sciences →  Computer Science →  Artificial Intelligence
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Bayesian Modeling and Causal Inference
Physical Sciences →  Computer Science →  Artificial Intelligence