JOURNAL ARTICLE

Ontology-enhanced Prompt-tuning for Few-shot Learning

Abstract

Few-shot learning (FSL) aims to make predictions from a limited number of samples. Structured data such as knowledge graphs and ontology libraries has been leveraged to benefit the few-shot setting in various tasks. However, the priors adopted by existing methods suffer from missing knowledge, knowledge noise, and knowledge heterogeneity, which hinder few-shot performance. In this study, we explore knowledge injection for FSL with pre-trained language models and propose ontology-enhanced prompt-tuning (OntoPrompt). Specifically, we develop an ontology transformation based on an external knowledge graph to address the missing-knowledge issue, completing the ontology and converting structured knowledge into text. We further introduce span-sensitive knowledge injection via a visible matrix, which selects informative knowledge to handle the knowledge-noise issue. To bridge the gap between knowledge and text, we propose a collective training algorithm that jointly optimizes their representations. We evaluate OntoPrompt on three tasks (relation extraction, event extraction, and knowledge graph completion) across eight datasets. Experimental results demonstrate that our approach obtains better few-shot performance than baselines.
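The span-sensitive knowledge injection described above can be illustrated with a small sketch of a visible matrix: injected knowledge tokens attend only to (and are attended only from) the text span they are anchored to, so noisy knowledge does not leak into unrelated tokens. This is our own simplified construction, not the authors' code; the function name, the segment layout (knowledge appended after the text tokens), and the indices are assumptions.

```python
import numpy as np

def build_visible_matrix(n_text, spans):
    """Build an (n_total, n_total) visibility mask.

    Text tokens are all mutually visible; each injected knowledge
    segment is visible only to itself and to its anchor span.
    1 = may attend, 0 = masked.

    spans: list of (span_start, span_end, k_len) tuples; knowledge
    segments are assumed to be appended after the text tokens, in order.
    """
    n_total = n_text + sum(k_len for _, _, k_len in spans)
    vis = np.zeros((n_total, n_total), dtype=int)
    vis[:n_text, :n_text] = 1                 # text tokens see each other
    offset = n_text
    for start, end, k_len in spans:
        k_idx = slice(offset, offset + k_len)
        vis[k_idx, k_idx] = 1                 # knowledge sees its own segment
        vis[k_idx, start:end] = 1             # knowledge sees its anchor span
        vis[start:end, k_idx] = 1             # anchor span sees the knowledge
        offset += k_len
    return vis

# Toy example: 5 text tokens; tokens 1..2 anchor a 2-token knowledge segment.
m = build_visible_matrix(5, [(1, 3, 2)])
```

Such a mask can be added (as a large negative bias on zero entries) to the attention scores of a transformer encoder, so the restriction is enforced inside self-attention rather than by editing the input sequence.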

Keywords:
Computer science; Ontology; Knowledge extraction; Knowledge graph; Domain knowledge; Knowledge base; Information retrieval; Artificial intelligence; Machine learning; Knowledge management

Metrics

Cited By: 59
FWCI (Field Weighted Citation Impact): 6.94
Refs: 74
Citation Normalized Percentile: 0.97 (in top 1% and top 10%)

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Multimodal Machine Learning Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Domain Adaptation and Few-Shot Learning (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Knowledge Contrast-Enhanced Continuous Prompt Tuning for few-shot learning

Fei Li, Youzhi Huang, Yanyan Wang, Zhengyi Chen, Yin Xu, Xiangyang Li

Journal: Neural Computing and Applications Year: 2025 Vol: 37 (19) Pages: 14151-14169
JOURNAL ARTICLE

PPT: Pre-trained Prompt Tuning for Few-shot Learning

Yuxian Gu, Xu Han, Zhiyuan Liu, Minlie Huang

Journal: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) Year: 2022 Pages: 8410-8423
JOURNAL ARTICLE

Dual Context-Guided Continuous Prompt Tuning for Few-Shot Learning

Jie Zhou, Le Tian, Houjin Yu, Zhou Xiao, Hui Su, Jie Zhou

Journal: Findings of the Association for Computational Linguistics: ACL 2022 Year: 2022 Pages: 79-84