JOURNAL ARTICLE

Abstractive Arabic Text Summarization Based on Deep Learning

Yaser M. Wazery, Marwa E. Saleh, Abdullah Alharbi, Abdelmgeid A. Ali

Year: 2022  Journal: Computational Intelligence and Neuroscience  Vol: 2022  Pages: 1-14  Publisher: Hindawi Publishing Corporation

Abstract

Text summarization (TS) is considered one of the most difficult tasks in natural language processing (NLP) and remains a significant challenge for modern computer systems despite all their recent improvements. Many papers and research studies address this task in the literature, but most are carried out on extractive summarization, and few on abstractive summarization, especially in the Arabic language due to its complexity. In this paper, an abstractive Arabic text summarization system is proposed, based on a sequence-to-sequence model. This model works through two components, an encoder and a decoder. Our aim is to build the sequence-to-sequence model using several deep artificial neural networks and investigate which of them achieves the best performance. Different layers of Gated Recurrent Units (GRU), Long Short-Term Memory (LSTM), and Bidirectional Long Short-Term Memory (BiLSTM) have been used to build the encoder and the decoder. In addition, the global attention mechanism has been used because it provides better results than the local attention mechanism. Furthermore, AraBERT preprocessing has been applied in the data preprocessing stage, which helps the model understand Arabic words and achieve state-of-the-art results. Moreover, a comparison between the skip-gram and the continuous bag-of-words (CBOW) word2vec word embedding models has been made. We have built these models using the Keras library and run them on a Google Colab Jupyter notebook. Finally, the proposed system is evaluated with the ROUGE-1, ROUGE-2, ROUGE-L, and BLEU evaluation metrics. The experimental results show that three layers of BiLSTM hidden states at the encoder achieve the best performance. In addition, our proposed system outperforms the other latest research studies. Also, the results show that abstractive summarization models that use the skip-gram word2vec model outperform the models that use the CBOW word2vec model.
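The abstract evaluates summaries with ROUGE-1, ROUGE-2, and ROUGE-L. As a minimal illustration of the ROUGE-N family, the sketch below computes the recall variant (overlapping n-grams divided by reference n-grams) in plain Python, assuming whitespace tokenization; the paper's actual setup (Arabic tokenization, the exact ROUGE implementation) may differ.

```python
from collections import Counter

def rouge_n(candidate: str, reference: str, n: int = 1) -> float:
    """ROUGE-N recall: count of n-grams shared by candidate and reference,
    divided by the total number of n-grams in the reference.
    Tokenization here is plain whitespace splitting (an assumption)."""
    def ngrams(tokens, n):
        # Multiset of n-grams, so repeated n-grams are counted correctly.
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    cand = ngrams(candidate.split(), n)
    ref = ngrams(reference.split(), n)
    overlap = sum(min(cand[g], ref[g]) for g in ref)  # clipped overlap count
    total = sum(ref.values())
    return overlap / total if total else 0.0
```

For example, against the reference "the cat sat on the mat", the candidate "the cat sat on mat" recovers 5 of 6 reference unigrams (ROUGE-1 ≈ 0.833) and 3 of 5 reference bigrams (ROUGE-2 = 0.6).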

Keywords:
Automatic summarization, Computer science, Artificial intelligence, Natural language processing, Deep learning, Preprocessing, Word embedding, Encoder, Word2vec, Language model, Sequence-to-sequence, Lexicon, Embedding

Metrics

Cited By: 68
FWCI (Field-Weighted Citation Impact): 13.31
Refs: 31
Citation Normalized Percentile: 0.98 (in top 1%, in top 10%)


Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Advanced Text Analysis Techniques (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Abstractive Text Summarization using Deep Learning

Rishank Tambe, Disha Thaokar, Eshika Pachghare, Prachi Sawane, Pranay Mehendole, Priti Kakde

Journal: International Journal for Research in Applied Science and Engineering Technology  Year: 2023  Vol: 11 (3)  Pages: 68-72
JOURNAL ARTICLE

Recent Trends in Deep Learning Based Abstractive Text Summarization

Neha Rane, Sharvari Govilkar

Journal: International Journal of Recent Technology and Engineering (IJRTE)  Year: 2019  Vol: 8 (3)  Pages: 3108-3115
JOURNAL ARTICLE

Abstractive text summarization using LSTM-CNN based deep learning

Shengli Song, Haitao Huang, Tongxiao Ruan

Journal: Multimedia Tools and Applications  Year: 2018  Vol: 78 (1)  Pages: 857-875