JOURNAL ARTICLE

Abstractive meeting summarization based on an attentional neural model

Abstract

People have always devoted a great deal of time to discussing new and important issues in meetings and conferences. With the evolution and abundance of Automatic Speech Recognition (ASR) frameworks, automatic transcription and even automatic meeting summarization are attracting growing interest. Recently, automatic summarization has made substantial progress on speech summarization, and neural models have been introduced to tackle many of the difficulties of abstractive summarization. Our contribution in this paper focuses on these weaknesses of neural abstractive meeting summarization and proposes an encoder-decoder model that applies an attention mechanism over the decoding sequence. We propose a deep encoder-decoder model based on an attention mechanism (DEDA) for ASR transcripts. Experiments on the AMI dataset demonstrate that our method achieves results competitive with state-of-the-art extractive and abstractive models. The experimental analyses also highlight the quality of the summarized utterances as well as the reduction of repetition in the generated summaries.
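The abstract describes, but does not specify, the attention step the decoder performs at each generation step. As a rough illustration only, here is a minimal NumPy sketch of dot-product attention over encoder states; all function names, scoring choices, and dimensions are assumptions for exposition, not the authors' DEDA implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_context(decoder_state, encoder_states):
    """One attention step: score each encoder state against the
    current decoder state, normalize the scores into a distribution,
    and return the weighted sum as the context vector."""
    scores = encoder_states @ decoder_state   # (T,) one score per input step
    weights = softmax(scores)                 # (T,) attention distribution
    context = weights @ encoder_states        # (d,) context vector
    return context, weights

# toy example: 4 encoder time steps, hidden size 3
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))   # hypothetical encoder hidden states
dec = rng.normal(size=(3,))     # hypothetical current decoder state
ctx, w = attention_context(dec, enc)
```

At generation time, `ctx` would be combined with the decoder state to predict the next summary token; focusing the distribution `w` on different transcript utterances at each step is one of the mechanisms the literature credits with reducing repeated output.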

Keywords:
Automatic summarization, Computer science, Decoding methods, Artificial intelligence, Encoder, Artificial neural network, Speech recognition, Deep learning, Repetition (rhetorical device), Natural language processing, Linguistics

Metrics

Cited by: 3
FWCI (Field-Weighted Citation Impact): 0.42
References: 0
Citation Normalized Percentile: 0.67
Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Advanced Text Analysis Techniques (Physical Sciences → Computer Science → Artificial Intelligence)