JOURNAL ARTICLE

Abstractive Event Summarization on Twitter

Quanzhi Li, Qiong Zhang

Year: 2020 Journal: Companion Proceedings of the Web Conference 2020

Abstract

This paper presents a new approach for automatically summarizing a social media event. It uses the BERT model as the encoder and a Transformer architecture as the decoder. The framework also includes an event topic prediction component; the predicted topic helps the decoder focus on the aspects specific to that topic category when generating the summary. To make the summary more succinct and coherent, a message selection model picks the most important messages from an event cluster, which are then encoded by the BERT model. Our preliminary experiment shows that our approach outperforms the baseline methods.
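The message selection step described in the abstract can be illustrated with a simple centroid-based heuristic, a hypothetical stand-in for the paper's learned selection model (the function names and the similarity criterion below are assumptions, not the authors' method): score each message in the event cluster by cosine similarity to the cluster's term-frequency centroid and keep the top-k for encoding.

```python
import math
from collections import Counter

def tf_vector(text):
    """Bag-of-words term-frequency vector for one message."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse Counter vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_messages(cluster, k=2):
    """Pick the k messages closest to the cluster centroid.

    A centroid heuristic standing in for the paper's learned
    message selection model; in the paper's framework, the
    selected messages would then be fed to the BERT encoder.
    """
    vectors = [tf_vector(m) for m in cluster]
    centroid = Counter()
    for v in vectors:
        centroid.update(v)
    return sorted(cluster,
                  key=lambda m: cosine(tf_vector(m), centroid),
                  reverse=True)[:k]

tweets = [
    "earthquake hits the city center this morning",
    "huge earthquake reported downtown city center",
    "my cat is sleeping again",
]
print(select_messages(tweets, k=2))  # the two earthquake messages
```

In a full pipeline, the selected messages would be concatenated and passed to the encoder, with the predicted event topic supplied to the decoder as an additional conditioning signal.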

Keywords:
Automatic summarization, Natural language processing, Social media, Event detection, Encoder, Transformer, Topic model, Information retrieval, Artificial intelligence, World Wide Web

Metrics

Cited By: 15
FWCI (Field Weighted Citation Impact): 1.32
Refs: 3
Citation Normalized Percentile: 0.84

Topics

Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Text Analysis Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Complex Network Analysis Techniques
Physical Sciences →  Physics and Astronomy →  Statistical and Nonlinear Physics

Related Documents

BOOK-CHAPTER

Twitter Based Event Summarization

Amrah Maryam, Rashid Ali

Communications in Computer and Information Science, Year: 2018, Pages: 165-174
JOURNAL ARTICLE

Abstractive Summarization

J. Balaji, T. V. Geetha, Ranjani Parthasarathi

Journal: International Journal on Semantic Web and Information Systems, Year: 2016, Vol: 12(2), Pages: 76-99
JOURNAL ARTICLE

Abstractive Timeline Summarization

Julius Steen, Katja Markert

Year: 2019, Pages: 21-31