JOURNAL ARTICLE

Double-Head Attention-Based Convolutional Neural Networks for Text Classification

Huaye Shi, Jianping Li, Yimou Xu

Year: 2019 · Published in: 2019 International Joint Conference on Information, Media and Engineering (IJCIME) · Vol: 12 · Pages: 27-31

Abstract

Because self-attention captures long-distance features without introducing sequence dependence, it has been widely used in machine translation and machine reading comprehension. However, self-attention remains under-explored in text classification. This paper proposes a classification model based on self-attention, applying the self-attention mechanism to text classification tasks in a principled way. Building on this model, we further propose a convolutional neural network (CNN) based on a double-head attention mechanism. Experimental results show that the double-head attention-based convolutional neural network (DHACNN) improves classification accuracy and test speed.
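The pipeline the abstract describes (two independent self-attention heads whose outputs feed a convolutional text classifier) can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual architecture: the head count matches the "double-head" idea, but all dimensions, the filter width, ReLU activation, and max-over-time pooling are assumptions borrowed from standard CNN text classifiers.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_head(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention over token embeddings X: (seq, d).
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

def double_head_attention(X, head_params):
    # Two independent heads; outputs concatenated along the feature axis.
    heads = [attention_head(X, *p) for p in head_params]
    return np.concatenate(heads, axis=-1)

def conv1d_features(A, filters, width=3):
    # Valid 1-D convolution over the sequence axis, ReLU, then
    # max-over-time pooling, as in standard CNN text classifiers.
    seq, d = A.shape
    windows = np.stack([A[i:i + width].ravel() for i in range(seq - width + 1)])
    return np.maximum(windows @ filters, 0).max(axis=0)

# Toy forward pass with illustrative (assumed) dimensions.
rng = np.random.default_rng(0)
seq, d, dh, n_filters = 10, 16, 8, 4
X = rng.standard_normal((seq, d))                       # token embeddings
head_params = [tuple(rng.standard_normal((d, dh)) for _ in range(3))
               for _ in range(2)]                       # two attention heads
A = double_head_attention(X, head_params)               # (10, 16): 2 heads x 8 dims
filters = rng.standard_normal((3 * 2 * dh, n_filters))  # width-3 conv filters
feats = conv1d_features(A, filters)                     # (4,) pooled features
logits = feats @ rng.standard_normal((n_filters, 2))    # binary class scores
```

The attention stage lets every token attend to every other token in one step (no recurrence), and the convolution then extracts local n-gram features from the attention-enriched representation, which is the combination the abstract argues for.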

Keywords:
Computer science, Convolutional neural network, Machine translation, Artificial intelligence, Attention mechanism, Machine learning, Natural language processing, Pattern recognition

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 24
Citation Normalized Percentile: 0.25

Topics

Advanced Text Analysis Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Text and Document Classification Technologies (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)