Sherilyn Kevin, Satish Mishra, Siddhi Sharma
Today, millions of documents are generated every hour, creating a need to summarize this data accurately and efficiently. Doing so manually is tedious, which motivates automatic summarization techniques capable of producing precise, concise summaries of long texts. Automatic summarization comprises two primary approaches: extractive and abstractive. Extractive summarization selects important sentences and keywords from the source text to construct the summary, whereas abstractive summarization interprets the text and generates new sentences. Abstractive summarization generally relies on the encoder-decoder architecture. This study reviews several transformer architectures, including T5, BART, and Pegasus, and presents a comparative analysis of these models on the same data, evaluating each model's output against manually written reference summaries using ROUGE-1, ROUGE-2, and ROUGE-L scores. The purpose of this study is to trace the advancement of abstractive text summarization models and to assess the strategies they employ and their usefulness.
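Since the models are scored against manually generated summaries with ROUGE-1, ROUGE-2, and ROUGE-L, a minimal pure-Python sketch of the ROUGE-N overlap computation may help clarify what these scores measure. The function names and whitespace tokenization below are illustrative assumptions, not from the study; practical evaluations typically use a dedicated library such as `rouge-score`.

```python
from collections import Counter


def ngrams(tokens, n):
    """Return the list of n-grams (as tuples) in a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]


def rouge_n(candidate, reference, n=1):
    """ROUGE-N F1 between a candidate summary and a reference summary.

    Counts clipped n-gram overlap (multiset intersection), then combines
    precision (overlap / candidate n-grams) and recall
    (overlap / reference n-grams) into an F1 score.
    """
    cand = ngrams(candidate.lower().split(), n)
    ref = ngrams(reference.lower().split(), n)
    if not cand or not ref:
        return 0.0
    overlap = sum((Counter(cand) & Counter(ref)).values())
    precision = overlap / len(cand)
    recall = overlap / len(ref)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

For example, `rouge_n("the cat sat on the mat", "the cat was on the mat", n=1)` shares five of six unigrams with the reference, giving precision and recall of 5/6 each. ROUGE-L differs in that it scores the longest common subsequence rather than fixed-length n-gram overlap.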