Text summarization has become essential in natural language processing owing to the exponential growth of textual data. Extractive methods select salient sentences but may produce incoherent summaries, whereas abstractive methods generate fluent summaries but risk omitting key information. This paper proposes a hybrid approach that combines BERT-based extractive summarization with T5-based abstractive summarization to capture both informativeness and coherence. The proposed framework is evaluated on the CNN/DailyMail and XSum datasets, where it outperforms standalone extractive and abstractive models on ROUGE and BLEU metrics.
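The two-stage pipeline the abstract describes (extract salient sentences first, then rewrite them abstractively) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the word-overlap centrality score stands in for BERT sentence embeddings, and `abstractive_rewrite` is a placeholder where a T5 model would normally be invoked.

```python
import re
from collections import Counter

def split_sentences(text):
    # Naive sentence splitter; a real system would use a proper tokenizer.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def score_sentences(sentences):
    # Centrality via word overlap with the whole document: a cheap
    # stand-in for cosine similarity between BERT sentence embeddings.
    doc_counts = Counter(
        w for s in sentences for w in re.findall(r"\w+", s.lower())
    )
    scores = []
    for s in sentences:
        words = re.findall(r"\w+", s.lower())
        if not words:
            scores.append(0.0)
        else:
            scores.append(sum(doc_counts[w] for w in words) / len(words))
    return scores

def extractive_summary(text, k=2):
    # Stage 1: select the k most central sentences, kept in document order.
    sentences = split_sentences(text)
    scores = score_sentences(sentences)
    top = sorted(sorted(range(len(sentences)), key=lambda i: -scores[i])[:k])
    return " ".join(sentences[i] for i in top)

def abstractive_rewrite(extracted):
    # Stage 2 placeholder: here a seq2seq model such as T5 would
    # paraphrase the extracted sentences into a fluent summary.
    return extracted

def hybrid_summarize(text, k=2):
    return abstractive_rewrite(extractive_summary(text, k=k))
```

In the actual framework the extractive scores come from a fine-tuned BERT encoder and the rewrite stage from T5, but the control flow (filter for informativeness, then generate for coherence) is the same.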
G. Ramesh, Vamsi Manyam, Vijoosh Mandula, Pavan Myana, Sharath Chandra Macha, S. Sravani Reddy
Mostafa Magdy, Abdelrahman M. Abdelbaky, Ensaf Hussein Mohamed
Priyadarshini Patil, Chandan Rao, G.V. Rithin Kumar Reddy, Riteesh Ram, S. M. Meena
Yuuki Iwasaki, Akihiro Yamashita, Yoko Konno, Katsushi Matsubayashi