JOURNAL ARTICLE

A Scalable Attention Mechanism Based Neural Network for Text Classification

Jianyun Zheng, Jianmin Pang, Xiaochuan Zhang, Di Sun, Xin Zhou, Kai Zhang, Dong Wang, MingLiang Li, Jun Wang

Year: 2020 | Journal: Journal of Physics: Conference Series | Vol: 1486 (2) | Pages: 022019 | Publisher: IOP Publishing

Abstract

In general, deep learning based text classification methods are considered effective but tend to be relatively slow, especially during model training. In this work, we present a powerful, so-called "scalable attention mechanism", which outperforms the conventional attention mechanism in both effectiveness and training speed. Based on the scalable attention mechanism, we propose a neural network for text classification. Experimental results on eight representative datasets show that our method obtains accuracy comparable to state-of-the-art methods while training in less than 4 minutes on an NVIDIA GTX 1080Ti GPU. To the best of our knowledge, our method is at least twice as fast as all published deep learning classifiers.
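The abstract does not specify how the proposed "scalable attention mechanism" differs from the conventional one, but the baseline it compares against is standard attention pooling over token representations: a learned query scores each token, the scores are normalized with a softmax, and the weighted sum of token states becomes the document representation fed to the classifier. The sketch below illustrates only that conventional baseline in NumPy; the weight vector `w` and dimensions are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w):
    """Conventional attention pooling (illustrative baseline).

    H: (seq_len, d) token hidden states, e.g. from an encoder.
    w: (d,) learned attention query vector (hypothetical parameter).
    Returns the pooled (d,) document vector and the attention weights.
    """
    scores = H @ w            # one relevance score per token, shape (seq_len,)
    alpha = softmax(scores)   # non-negative weights summing to 1
    return alpha @ H, alpha   # weighted sum of token states

# Toy usage: 10 tokens with 16-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.standard_normal((10, 16))
w = rng.standard_normal(16)
pooled, alpha = attention_pool(H, w)
```

The pooled vector would then pass through a linear classification layer; the paper's contribution is a faster-to-train variant of this scoring step, whose exact formulation is given in the article itself.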

Keywords:
Computer science, Scalability, Mechanism (biology), Artificial intelligence, Artificial neural network, Machine learning, Deep learning, Deep neural networks, Database

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 5
Citation Normalized Percentile: 0.03

Topics

Handwritten Text Recognition Techniques (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Text and Document Classification Technologies (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)