JOURNAL ARTICLE

Attention based Recurrent Neural Network for Nepali Text Summarization

Bipin Timalsina, Nawaraj Paudel, Tej Bahadur Shahi

Year: 2022  Journal: Journal of Institute of Science and Technology  Vol: 27 (1)  Pages: 141-148

Abstract

Automatic text summarization has been a challenging topic in natural language processing (NLP), as it demands preserving important information while condensing a large text into a short summary. Extractive and abstractive summarization are the two widely investigated approaches. In extractive summarization, the important sentences are extracted from the source text and combined to create a summary, whereas abstractive summarization creates a summary that focuses on meaning rather than surface content. Therefore, abstractive summarization has gained more attention from researchers in the recent past. However, text summarization is still a largely unexplored topic for the Nepali language. To this end, we propose an abstractive text summarization model for Nepali text. First, we create a Nepali text dataset by scraping Nepali news from online news portals. Second, we design a deep learning-based summarization model built on an encoder-decoder recurrent neural network with attention; more precisely, Long Short-Term Memory (LSTM) cells are used in the encoder and decoder layers. Third, we build nine different models by varying hyper-parameters such as the number of hidden layers and the number of hidden states. Finally, we report the Recall-Oriented Understudy for Gisting Evaluation (ROUGE) score for each model to evaluate its performance. Among the nine models, the model with a single-layer encoder and 256 hidden states outperformed all others, with F-scores of 15.74, 3.29, and 15.21 for ROUGE-1, ROUGE-2, and ROUGE-L, respectively.
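
The ROUGE scores reported in the abstract measure n-gram and longest-common-subsequence overlap between a generated summary and a reference summary. A minimal sketch of how ROUGE-N and ROUGE-L F-scores can be computed is shown below; it assumes simple whitespace tokenization, and the example strings are illustrative, not from the paper's dataset (published results typically use a standard ROUGE toolkit):

```python
from collections import Counter

def rouge_n_f(reference, candidate, n=1):
    """ROUGE-N F-score: clipped n-gram overlap between reference and candidate."""
    def ngrams(tokens):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    ref, cand = ngrams(reference.split()), ngrams(candidate.split())
    overlap = sum((ref & cand).values())  # matches, clipped by reference counts
    if overlap == 0:
        return 0.0
    recall = overlap / sum(ref.values())
    precision = overlap / sum(cand.values())
    return 2 * precision * recall / (precision + recall)

def rouge_l_f(reference, candidate):
    """ROUGE-L F-score: based on the longest common subsequence (LCS) of tokens."""
    r, c = reference.split(), candidate.split()
    # Dynamic-programming table for LCS length.
    dp = [[0] * (len(c) + 1) for _ in range(len(r) + 1)]
    for i in range(1, len(r) + 1):
        for j in range(1, len(c) + 1):
            if r[i - 1] == c[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    lcs = dp[-1][-1]
    if lcs == 0:
        return 0.0
    recall, precision = lcs / len(r), lcs / len(c)
    return 2 * precision * recall / (precision + recall)
```

For example, `rouge_n_f("the cat sat", "the cat ran")` matches 2 of 3 unigrams in both directions, giving precision = recall = F ≈ 0.667.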

Keywords:
Automatic summarization, Computer science, Nepali, Natural language processing, Artificial intelligence, Sentence, Recurrent neural network, Encoder, Deep learning, Multi-document summarization, Information retrieval, Text graph, Artificial neural network, Linguistics

Metrics

Cited By: 2
FWCI (Field Weighted Citation Impact): 0.39
Refs: 39
Citation Normalized Percentile: 0.61

Topics

Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Natural Language Processing Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Text Analysis Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Text summarization using residual-based temporal attention convolutional neural network

Reshmi P. Rajan, Deepa Jose

Journal: International Journal of Information Technology  Year: 2023
JOURNAL ARTICLE

Text Summarization Method Based on Gated Attention Graph Neural Network

Jingui Huang, Wenya Wu, Jingyi Li, Shengchun Wang

Journal: Sensors  Year: 2023  Vol: 23 (3)  Pages: 1654-1654
JOURNAL ARTICLE

Text Recommendation Based on Heterogeneous Attention Recurrent Neural Network

NIU Yaoqiang, MENG Yuyu, NIU Quanfu

Journal: DOAJ (Directory of Open Access Journals)  Year: 2020
JOURNAL ARTICLE

English to Nepali Sentence Translation Using Recurrent Neural Network with Attention

Kriti Nemkul, Subarna Shakya

Journal: 2021 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS)  Year: 2021  Vol: 521  Pages: 607-611