CONFERENCE PAPER

Commonsense Knowledge Base Completion with Relational Graph Attention Network and Pre-trained Language Model

Jinhao Ju, Deqing Yang, Jingping Liu

Year: 2022
Venue: Proceedings of the 31st ACM International Conference on Information & Knowledge Management
Pages: 4104-4108

Abstract

Many commonsense knowledge graphs (CKGs) still suffer from incompleteness, although they have been applied successfully in many natural language processing tasks. Due to the scale and sparsity of CKGs, existing knowledge base completion models are still not competent for them. In this paper, we propose a commonsense knowledge base completion (CKBC) model that learns the structural and contextual representations of CKG nodes and relations with a relational graph attention network and a pre-trained language model, respectively. Based on these two types of representations, the scoring decoder in our model makes more accurate predictions for a given triple. Our empirical studies on the representative CKG ConceptNet demonstrate our model's superiority over state-of-the-art CKBC models.
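To make the described architecture concrete, below is a minimal sketch, not the authors' released code: it assumes PyTorch, and every class name, dimension, and fusion choice is an illustrative assumption. A simplified relation-aware graph attention layer produces structural node embeddings, random vectors stand in for the contextual embeddings a pre-trained language model would provide, and a small MLP decoder scores a (head, relation, tail) triple from both.

```python
# Minimal sketch (assumptions, not the paper's code): a relation-aware graph
# attention layer yields structural node embeddings; precomputed vectors stand
# in for contextual embeddings from a pre-trained language model; an MLP
# decoder scores a (head, relation, tail) triple from both representations.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationalGraphAttentionLayer(nn.Module):
    """Simplified relational graph attention: each edge message combines the
    source node's embedding with the edge's relation embedding, and messages
    are aggregated per destination node with softmax attention weights."""

    def __init__(self, dim: int):
        super().__init__()
        self.msg = nn.Linear(2 * dim, dim)  # message from (neighbor, relation)
        self.att = nn.Linear(2 * dim, 1)    # attention logit from (target, message)

    def forward(self, x, rel_emb, edge_index, edge_type):
        # x: (num_nodes, dim); rel_emb: (num_relations, dim)
        # edge_index: (2, num_edges) with rows (source, destination)
        src, dst = edge_index
        m = self.msg(torch.cat([x[src], rel_emb[edge_type]], dim=-1))
        logits = F.leaky_relu(self.att(torch.cat([x[dst], m], dim=-1))).squeeze(-1)
        # Normalize attention over the incoming edges of each destination node.
        alpha = torch.zeros_like(logits)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = torch.softmax(logits[mask], dim=0)
        out = torch.zeros_like(x)
        out.index_add_(0, dst, alpha.unsqueeze(-1) * m)  # weighted message sum
        return F.elu(out)


class TripleScorer(nn.Module):
    """Scores a triple from the concatenated structural and contextual
    representations of its head and tail plus the relation embedding."""

    def __init__(self, dim: int):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(5 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, head_struct, head_ctx, rel, tail_struct, tail_ctx):
        feats = torch.cat([head_struct, head_ctx, rel, tail_struct, tail_ctx], dim=-1)
        return self.mlp(feats).squeeze(-1)  # higher score = more plausible triple


# Toy usage on a 3-node, 3-edge graph; ctx stands in for PLM output vectors.
dim, num_nodes, num_rels = 64, 3, 2
x = torch.randn(num_nodes, dim)    # initial structural node features
ctx = torch.randn(num_nodes, dim)  # placeholder for PLM contextual embeddings
rel = nn.Embedding(num_rels, dim)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 0]])
edge_type = torch.tensor([0, 1, 0])

x_struct = RelationalGraphAttentionLayer(dim)(x, rel.weight, edge_index, edge_type)
score = TripleScorer(dim)(x_struct[0:1], ctx[0:1], rel.weight[0:1], x_struct[1:2], ctx[1:2])
print(score.item())
```

The per-node softmax loop is written for readability; a real implementation would use a scatter-based softmax over edges, and the paper's actual encoders, fusion scheme, and decoder may differ from this sketch.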

Keywords:
Computer science, Commonsense knowledge, Knowledge base, Knowledge graph, Artificial intelligence, Graph, Natural language processing, Language model, Natural language understanding, Question answering, Relational model, Natural language, Machine learning, Theoretical computer science, Relational database, Information retrieval

Metrics

Cited By: 10
FWCI (Field Weighted Citation Impact): 1.18
References: 16
Citation Normalized Percentile: 0.79

Topics

Advanced Graph Neural Networks
Physical Sciences → Computer Science → Artificial Intelligence
Topic Modeling
Physical Sciences → Computer Science → Artificial Intelligence
Data Quality and Management
Social Sciences → Decision Sciences → Management Science and Operations Research