Songlin Sun, Qing Sun, Kevin Zhou, Tengchao Lv
Most current effective methods for text classification rely on large-scale labeled data and a great number of parameters, but when supervised training data are scarce and difficult to collect, these models are not applicable. In this paper, we propose hierarchical attention prototypical networks (HAPN) for few-shot text classification. We design feature-level, word-level, and instance-level multi-cross attention for our model to enhance the expressive ability of the semantic space. We verify the effectiveness of our model on two standard benchmark few-shot text classification datasets, FewRel and CSID, and achieve state-of-the-art performance. Visualization of the hierarchical attention layers illustrates that our model can capture the more important features, words, and instances separately. In addition, our attention mechanism increases support-set augmentability and accelerates convergence in the training stage.
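To illustrate the instance-level attention idea the abstract describes, the sketch below shows how a class prototype can be formed as a query-dependent weighted average of support embeddings, rather than the plain mean of vanilla prototypical networks. This is a minimal illustration under assumed shapes and dot-product scoring, not the paper's exact formulation; the function names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attentive_prototype(support, query):
    """Instance-level attention (illustrative): weight each support
    embedding by its similarity to the query before averaging into a
    class prototype, so noisy support instances contribute less.

    support: (K, D) embeddings of K support instances for one class
    query:   (D,)   embedding of the query instance
    returns: (D,)   attended class prototype
    """
    scores = support @ query      # (K,) similarity of each instance to the query
    weights = softmax(scores)     # attention weights over support instances
    return weights @ support      # weighted average of support embeddings

rng = np.random.default_rng(0)
support = rng.normal(size=(5, 8))   # K=5 support instances, D=8 dims
query = rng.normal(size=8)
proto = attentive_prototype(support, query)
assert proto.shape == (8,)
```

Classification then proceeds as in standard prototypical networks: the query is assigned to the class whose (attended) prototype is nearest in the embedding space.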