JOURNAL ARTICLE

Prompt-based Zero-shot Text Classification with Conceptual Knowledge

Abstract

In recent years, pre-trained language models have garnered significant attention due to their effectiveness, which stems from the rich knowledge acquired during pre-training. To mitigate the inconsistency between pre-training tasks and downstream tasks, prompt-based approaches have been introduced; they are particularly useful in low-resource scenarios. However, existing approaches mostly rely on verbalizers to translate the predicted vocabulary into task-specific labels. The major limitations of this approach are that it ignores potentially relevant domain-specific words and that it is biased by the pre-training data. To address these limitations, we propose a framework that incorporates conceptual knowledge for text classification in the extreme zero-shot setting. The framework comprises prompt-based keyword extraction, weight assignment to each prompt keyword, and final representation estimation in the knowledge graph embedding space. We evaluated the method on four widely used datasets for sentiment analysis and topic detection, demonstrating that it consistently outperforms recently developed prompt-based approaches under the same experimental settings.
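The three steps named in the abstract can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the keyword list, the weights, and the embedding tables are all hypothetical stand-ins (in the actual framework, keywords and weights would come from a masked language model's prompt predictions, and the embeddings from a pre-trained conceptual knowledge graph space).

```python
# Toy sketch of the pipeline: (1) prompt keywords, (2) keyword weights,
# (3) weighted average in a knowledge-graph embedding space, then nearest label.
import numpy as np

# Hypothetical knowledge-graph embeddings for a few concepts (assumed values).
kg_embeddings = {
    "film":  np.array([0.9, 0.1, 0.0]),
    "actor": np.array([0.8, 0.2, 0.1]),
    "match": np.array([0.1, 0.9, 0.2]),
    "goal":  np.array([0.0, 0.8, 0.3]),
}
# Hypothetical label embeddings in the same space.
label_embeddings = {
    "movies": np.array([1.0, 0.0, 0.0]),
    "sports": np.array([0.0, 1.0, 0.1]),
}

def classify(keywords_with_weights):
    """Weighted average of keyword embeddings, then nearest label by cosine."""
    vecs = np.array([kg_embeddings[k] for k, _ in keywords_with_weights])
    w = np.array([w for _, w in keywords_with_weights], dtype=float)
    w = w / w.sum()          # normalize the prompt-keyword weights
    doc = w @ vecs           # final document representation

    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    return max(label_embeddings, key=lambda l: cos(doc, label_embeddings[l]))

# Suppose a prompt such as "<text> This article talks about [MASK]." yielded
# these (hypothetical) keywords and weights:
keywords = [("film", 0.6), ("actor", 0.4)]
print(classify(keywords))  # → movies
```

Because the final representation lives in the concept embedding space rather than the verbalizer's fixed vocabulary, keywords outside the label words themselves can still contribute evidence, which is the motivation the abstract gives for this design.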

Keywords:
Computer science, Vocabulary, Artificial intelligence, Natural language processing, Conceptual graph, Machine learning, Domain knowledge, Task (project management), Embedding, Language model, Graph, Knowledge representation and reasoning, Theoretical computer science

Metrics

Cited by: 9
FWCI (Field-Weighted Citation Impact): 2.30
References: 25
Citation Normalized Percentile: 0.87

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Sentiment Analysis and Opinion Mining (Physical Sciences → Computer Science → Artificial Intelligence)
Misinformation and Its Impacts (Social Sciences → Sociology and Political Science)