Recently, prompt-based learning has shown impressive performance on various natural language processing tasks in few-shot scenarios. Previous studies on knowledge probing have shown that the success of prompt learning stems from the implicit knowledge stored in pre-trained language models. However, how this implicit knowledge helps solve downstream tasks remains unclear. In this work, we propose a knowledge-guided prompt learning method that can reveal relevant knowledge for text classification. Specifically, we design a knowledge prompting template and two multi-task frameworks. Experiments demonstrate the superiority of combining knowledge with prompt learning in few-shot text classification.
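To illustrate the general idea of prompt-based text classification referred to above, the following is a minimal sketch of a cloze-style prompt with a verbalizer. The template wording, label words, and the toy scoring function are illustrative assumptions, not the paper's actual design; in practice the scorer would be a pre-trained language model predicting the word at the [MASK] position.

```python
# Minimal sketch of prompt-based text classification.
# Assumptions: the template wording, label words, and scoring function
# below are illustrative, not the method proposed in this paper.

# A cloze-style prompt template wraps the input text around a [MASK] slot.
TEMPLATE = "{text} This topic is about [MASK]."

# A verbalizer maps each class label to a label word that may fill [MASK].
VERBALIZER = {"sports": "sports", "politics": "politics", "science": "science"}

def build_prompt(text: str) -> str:
    """Fill the cloze template with the input text."""
    return TEMPLATE.format(text=text)

def classify(text: str, score_word) -> str:
    """Pick the label whose label word scores highest at the [MASK] slot.

    `score_word(prompt, word)` stands in for a pre-trained language
    model's probability of `word` filling [MASK]; the caller supplies it.
    """
    prompt = build_prompt(text)
    return max(VERBALIZER, key=lambda label: score_word(prompt, VERBALIZER[label]))

# Toy scorer: counts keyword overlap instead of querying a real LM.
def toy_scorer(prompt: str, word: str) -> float:
    return float(prompt.lower().count(word.lower()))

print(classify("The team won the sports championship.", toy_scorer))  # sports
```

In a few-shot setting, only the template and verbalizer need to be chosen; the language model's parameters are reused, which is why implicit knowledge in the pre-trained model matters.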
Dianbo Sui, Yubo Chen, Binjie Mao, Delai Qiu, Kang Liu, Jun Zhao