Because self-attention captures long-distance features without introducing sequential dependencies, it has been widely used in machine translation and machine reading comprehension. However, self-attention remains under-explored in text classification. This paper proposes a classification model based on self-attention, which applies the self-attention mechanism to text classification tasks in a principled way. In addition, building on the self-attention model, we further propose a convolutional neural network (CNN) based on a double-head attention mechanism. The experimental results show that the double-head attention-based convolutional neural network (DHACNN) improves classification accuracy and speeds up inference.
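The paper's exact architecture is not reproduced here, but the scaled dot-product self-attention computation it builds on can be sketched as follows. This is a minimal NumPy illustration: the function names, weight matrices, and dimensions are assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over token embeddings X of shape
    (seq_len, d_model); Wq, Wk, Wv project to a d_k-dimensional head."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) pairwise scores
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # one context vector per token

# Illustrative dimensions (hypothetical, not from the paper)
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 8, 4
X = rng.standard_normal((seq_len, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 4)
```

Because every token attends to every other token in a single matrix product, long-distance interactions are captured without the step-by-step recurrence of an RNN; a "double-head" variant would run two such projections in parallel and combine their outputs before the convolutional layers.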
Ming Hao, Bo Xu, Jingyi Liang, Bowen Zhang, Xu-Cheng Yin