JOURNAL ARTICLE

Efficient Knowledge Graph Construction with Pre-trained Language Models

Ningyu Zhang

Year: 2023 | Journal: Zenodo (CERN European Organization for Nuclear Research) | Publisher: European Organization for Nuclear Research

Abstract

Constructing knowledge graphs (KGs) is essential for various natural language understanding tasks, such as question answering, information extraction, and recommendation systems. However, the process of building KGs is typically time-consuming and labor-intensive. This talk introduces technologies and tools to efficiently construct KGs using pre-trained language models (PLMs).
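To make the idea concrete, KG construction with a PLM is often framed as extracting (head, relation, tail) triples from text, e.g. by prompting a generative model to emit triples in a fixed format and parsing its output. The sketch below shows only the parsing step; the triple output format and the example model output are assumptions for illustration, not the method described in this work.

```python
import re

def parse_triples(generated: str):
    """Parse '(head, relation, tail)' spans from a PLM's generated text.

    Assumes the model was prompted to emit triples in parenthesized,
    comma-separated form -- a common but not universal convention.
    """
    triples = []
    for h, r, t in re.findall(r"\(([^,()]+),\s*([^,()]+),\s*([^,()]+)\)", generated):
        triples.append((h.strip(), r.strip(), t.strip()))
    return triples

# Hypothetical PLM output for the sentence:
# "Marie Curie was born in Warsaw and won the Nobel Prize."
plm_output = "(Marie Curie, born_in, Warsaw) (Marie Curie, won, Nobel Prize)"
print(parse_triples(plm_output))
# → [('Marie Curie', 'born_in', 'Warsaw'), ('Marie Curie', 'won', 'Nobel Prize')]
```

In practice the generation step would come from a pre-trained model (e.g. via a library such as Hugging Face Transformers), and the parsed triples would then be deduplicated and linked into the growing graph.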

Keywords:
Knowledge graph, Construct (python library), Natural language, Graph, Question answering, Natural language understanding, Process (computing), Universal Networking Language, Knowledge base, Language model

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 0
Citation Normalized Percentile: 0.28

Topics

Advanced Graph Neural Networks (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Multimodal Machine Learning Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)

Related Documents

JOURNAL ARTICLE

SimKGC: Simple Contrastive Knowledge Graph Completion with Pre-trained Language Models

Liang Wang, Wei Zhao, Zhuoyu Wei, Jingming Liu

Journal:   Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) Year: 2022 Pages: 4281-4294