To address the limited semantic understanding, fluency, and accuracy of summaries in neural abstractive summarization, an automatic text summarization model is proposed. First, a decoder attention mechanism is introduced into the reference network, which improves the model's ability to understand the source text and to generate in-vocabulary words. Second, a multi-hop attention mechanism strengthens the model's ability to extract words directly from the original text, improving its handling of out-of-vocabulary words. Experimental results on the CNN/Daily Mail dataset show that the model performs well under the standard evaluation metrics, improving both summary accuracy and sentence fluency.
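As an illustration of the multi-hop idea mentioned above, the following is a minimal NumPy sketch in which a query vector is refined over several attention passes across the encoder memory. The function name, hop count, and update rule are assumptions for illustration; the paper's exact formulation may differ.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_hop_attention(memory, query, hops=3):
    """Sketch of multi-hop attention (illustrative, not the paper's exact model).

    memory: (T, d) encoder states for T source tokens.
    query:  (d,) decoder query vector.
    Each hop attends over the memory and folds the resulting
    context back into the query before attending again.
    """
    weights = None
    for _ in range(hops):
        scores = memory @ query        # (T,) dot-product scores
        weights = softmax(scores)      # attention distribution over source tokens
        context = weights @ memory     # (d,) weighted sum of encoder states
        query = query + context        # refine the query with the context
    return query, weights

rng = np.random.default_rng(0)
memory = rng.normal(size=(5, 8))   # 5 source tokens, hidden size 8
query = rng.normal(size=8)
refined, attn = multi_hop_attention(memory, query)
```

The final attention distribution `attn` is what a pointer-style component could use to copy source words, which is how such mechanisms typically help with out-of-vocabulary tokens.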