JOURNAL ARTICLE

Hierarchical Self-Attention Hybrid Sparse Networks for Document Classification

Weichun Huang, Ziqiang Tao, Xiaohui Huang, Liyan Xiong, Jia Yuan Yu

Year: 2021 | Journal: Mathematical Problems in Engineering | Vol: 2021 | Pages: 1-10 | Publisher: Hindawi Publishing Corporation

Abstract

Document classification is a fundamental problem in natural language processing, and deep learning has demonstrated great success in this task. However, most existing models do not incorporate sentence structure as a semantic feature in their architectures and pay little attention to the contextual importance of words and sentences. In this paper, we present a new model for document classification based on a sparse recurrent neural network and a self-attention mechanism. We further analyze three variants of GRU and LSTM to evaluate the sparse model on different datasets. Extensive experiments demonstrate that our model obtains competitive performance and outperforms previous models.
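The abstract's central building block, the self-attention mechanism used to weight words and sentences by contextual importance, can be illustrated with a minimal sketch. This is an assumption-laden toy (scaled dot-product attention over a small set of vectors in NumPy), not the paper's actual architecture, which combines it with a sparse recurrent network:

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over a sequence of vectors.

    Minimal illustration only: the paper's learned projections,
    hierarchy, and sparsity scheme are not reproduced here.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                    # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ X                               # context-weighted vectors

# Toy example: three hypothetical "sentence vectors" of dimension 4.
X = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0]])
out = self_attention(X)
print(out.shape)  # (3, 4)
```

Each output row is a convex combination of the input vectors, so sentences that are similar to many others receive more weight in the document representation.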

Keywords:
Computer science, Artificial intelligence, Task (project management), Sentence, Feature (linguistics), Document classification, Natural language processing, Machine learning, Linguistics, Engineering

Metrics

Cited by: 4
FWCI (Field-Weighted Citation Impact): 0.42
References: 38
Citation Normalized Percentile: 0.67


Topics

Text and Document Classification Technologies (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Sentiment Analysis and Opinion Mining (Physical Sciences → Computer Science → Artificial Intelligence)