JOURNAL ARTICLE

Prompt-Contrastive Learning for Zero-Shot Relation Extraction

Xueyi Zhong, Liye Zhao, Licheng Peng, Guodong Yang, Kun Hu, Wansen Wu

Year: 2026   Journal: Entropy   Vol: 28 (1)   Pages: 69   Publisher: Multidisciplinary Digital Publishing Institute

Abstract

Relation extraction is an essential task for knowledge acquisition and management, defined as determining the relation between two annotated entities in a piece of text. In recent years, zero-shot learning has been introduced to train relation extraction models because continually annotating emerging relations is expensive. Current methods attempt to transfer knowledge of seen relations to predictions of unseen relations by recasting relation extraction as other tasks. However, the divergence in task formulations prevents relation extraction models from acquiring informative semantic representations, resulting in inferior performance. In this paper, we exploit the relational knowledge contained in pre-trained language models, which can provide informative signals for representing unseen relations based on seen ones. To this end, we investigate a Prompt-Contrastive learning approach for Relation Extraction under a zero-shot setting, namely PCRE. Specifically, building on prompt tuning to elicit semantic knowledge from pre-trained language models, we augment each instance with different prompt templates to construct two views for an instance-level contrastive objective. Additionally, we devise an instance-description contrastive objective to elicit relational knowledge from relation descriptions. Through joint optimization, the relation extraction model learns to separate relations. Experimental results show that PCRE outperforms state-of-the-art baselines on zero-shot relation extraction. Further extensive analysis verifies that our approach is robust across different datasets, numbers of seen relations, and numbers of training instances.
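The instance-level contrastive objective described in the abstract, where two prompt-template views of the same instance form a positive pair and other instances in the batch act as negatives, is a standard symmetric InfoNCE setup. The sketch below is an illustrative assumption, not the paper's actual implementation: the function name `info_nce`, the NumPy embedding matrices, and the temperature value are all hypothetical stand-ins for the encoder outputs PCRE would produce.

```python
import numpy as np

def info_nce(view_a, view_b, temperature=0.1):
    """Symmetric InfoNCE loss over two (n, d) embedding matrices.

    Row i of view_a and row i of view_b are assumed to encode the same
    instance under two different prompt templates (a positive pair);
    every other row in the batch serves as an in-batch negative.
    """
    # L2-normalize so the dot product is cosine similarity
    a = view_a / np.linalg.norm(view_a, axis=1, keepdims=True)
    b = view_b / np.linalg.norm(view_b, axis=1, keepdims=True)
    logits = (a @ b.T) / temperature  # (n, n); diagonal = positive pairs

    def xent_diag(lg):
        # Cross-entropy with the diagonal as the target class,
        # computed with a numerically stable log-softmax per row.
        lg = lg - lg.max(axis=1, keepdims=True)
        log_prob = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(log_prob))

    # Average both anchor directions (a -> b and b -> a)
    return 0.5 * (xent_diag(logits) + xent_diag(logits.T))
```

When the two views encode the same instances, the diagonal similarities dominate and the loss is small; unrelated views push the loss toward log(batch size). The abstract's instance-description objective would follow the same pattern, pairing each instance embedding with the embedding of its relation description instead of a second prompt view.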

Keywords:
Relation extraction, Zero-shot learning, Divergence, Task formulation, Information extraction, Representation, Knowledge acquisition

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 27
Citation Normalized Percentile: 0.31

Topics

Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Graph Neural Networks
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE

RCL: Relation Contrastive Learning for Zero-Shot Relation Extraction

Shusen Wang, Bosen Zhang, Yajing Xu, Yanan Wu, Bo Xiao

Journal: Findings of the Association for Computational Linguistics: NAACL 2022   Year: 2022   Pages: 2456-2468
JOURNAL ARTICLE

Dynamic Prompt-Driven Zero-Shot Relation Extraction

Liang Xu, Xiaoxuan Bu, Xuetao Tian

Journal: IEEE/ACM Transactions on Audio, Speech, and Language Processing   Year: 2024   Vol: 32   Pages: 2900-2912