Abstract

Text summarization has become a prominent problem in natural language processing and deep learning in recent years. In general, text summarization produces a short note on a large text document. Our main purpose is to create a short, fluent, and understandable abstractive summary of a text document. To build a good summarizer we used the Amazon Fine Food Reviews dataset, which is available on Kaggle. We used the review text descriptions as input data and generated a simple summary of each review description as output. To help produce a more comprehensive summary, we used a bidirectional RNN with LSTMs in the encoding layer and an attention model in the decoding layer, and applied a sequence-to-sequence model to generate a short summary of the food descriptions. Working with an abstractive text summarizer raises several challenges, such as text preprocessing, vocabulary counting, handling missing words, word embedding, improving the efficiency of the model (reducing the loss value), and producing a fluent machine-generated summary. In this paper, the main goal was to increase the efficiency and reduce the training loss of the sequence-to-sequence model in order to build a better abstractive text summarizer. In our experiments, we successfully reduced the training loss to a value of 0.036, and our abstractive text summarizer is able to create a short English-to-English summary.
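To make the described architecture concrete, the following is a minimal sketch (not the authors' published code) of a sequence-to-sequence summarizer with a bidirectional LSTM encoder and an attention-based LSTM decoder, written with TensorFlow/Keras. The vocabulary size, embedding dimension, hidden size, and layer names are illustrative assumptions, and Bahdanau-style additive attention is used as one plausible choice for the attention model mentioned in the abstract.

import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE = 20000   # assumed vocabulary size after preprocessing
EMB_DIM = 128        # assumed word-embedding dimension
HIDDEN = 256         # assumed LSTM hidden size

# Encoder: embed the review tokens and run a bidirectional LSTM over them.
enc_inputs = layers.Input(shape=(None,), name="review_tokens")
enc_emb = layers.Embedding(VOCAB_SIZE, EMB_DIM, mask_zero=True)(enc_inputs)
enc_outputs, fwd_h, fwd_c, bwd_h, bwd_c = layers.Bidirectional(
    layers.LSTM(HIDDEN, return_sequences=True, return_state=True)
)(enc_emb)
# Concatenate the forward/backward states to initialise the decoder.
state_h = layers.Concatenate()([fwd_h, bwd_h])
state_c = layers.Concatenate()([fwd_c, bwd_c])

# Decoder: embed the (shifted) summary tokens, run an LSTM, and attend over
# the encoder outputs before predicting each summary word.
dec_inputs = layers.Input(shape=(None,), name="summary_tokens")
dec_emb = layers.Embedding(VOCAB_SIZE, EMB_DIM, mask_zero=True)(dec_inputs)
dec_outputs, _, _ = layers.LSTM(
    2 * HIDDEN, return_sequences=True, return_state=True
)(dec_emb, initial_state=[state_h, state_c])
# Bahdanau-style additive attention over the encoder outputs.
context = layers.AdditiveAttention()([dec_outputs, enc_outputs])
dec_concat = layers.Concatenate()([dec_outputs, context])
logits = layers.Dense(VOCAB_SIZE, activation="softmax")(dec_concat)

model = Model([enc_inputs, dec_inputs], logits)
# Targets are the summary tokens shifted one step to the left of dec_inputs.
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()

At inference time the decoder would be run step by step, feeding each generated token back as the next decoder input, which is the usual way such a model produces the short summaries described in the abstract.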

Keywords:
Automatic summarization, Computer science, Natural language processing, Artificial intelligence, Sequence (biology), Word (group theory), Recurrent neural network, Word embedding, Vocabulary, Text processing, Decoding methods, Deep learning, Information retrieval, Artificial neural network, Embedding, Linguistics

Metrics

Cited By: 44
FWCI (Field-Weighted Citation Impact): 4.15
References: 8
Citation Normalized Percentile: 0.95

Topics

Topic Modeling
Natural Language Processing Techniques
Advanced Text Analysis Techniques
(all classified under Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

BOOK CHAPTER

Abstractive Text Summarization Methods with Sequence to Sequence RNN
Pallavi A. Wani, Archana Gulati
Lecture Notes in Networks and Systems, 2025, pp. 119-129

JOURNAL ARTICLE

Neural Abstractive Text Summarization with Sequence-to-Sequence Models
Tian Shi, Yaser Keneshloo, Naren Ramakrishnan, Chandan K. Reddy
ACM/IMS Transactions on Data Science, 2021, Vol. 2 (1), pp. 1-37

JOURNAL ARTICLE

Text Summarization in Assamese Language using Sequence to Sequence RNNs
Pritom Jyoti Goutom, Nomi Baruah
Indian Journal of Science and Technology, 2023, Vol. 16 (SP2), pp. 22-29

JOURNAL ARTICLE

Turkish abstractive text summarization using pretrained sequence-to-sequence models
Batuhan Baykara, Tunga Güngör
Natural Language Engineering, 2022, Vol. 29 (5), pp. 1275-1304