Traditional document summarization models perform poorly on dialogues due to the complex referential relationships among personal pronouns and insufficient modeling of conversational structure. To address this problem, we propose RE-BART, a novel end-to-end Transformer-based model for abstractive dialogue summarization that enhances BART with relation information. Our model leverages both local and global relations in a conversation to model the dialogue and generate better summaries. Specifically, the verb and its related arguments in a single utterance form a local event that aids in encoding the dialogue, while coreference information across the whole conversation represents the global relation, which helps trace the topic and the information flow among speakers. We then design a dialogue relation enhanced model to capture both types of information. Experiments on the SAMSum dataset show that our model outperforms various dialogue summarization approaches and achieves new state-of-the-art ROUGE results.
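The two relation types described above can be made concrete with a small sketch. This is not the paper's implementation: the data structures, function names, and the hand-annotated example dialogue below are illustrative assumptions, and in practice a semantic role labeler and a coreference resolver would produce the annotations automatically.

```python
# Hedged sketch: representing the two relation types the abstract describes --
# local verb-argument "events" within each utterance, and global coreference
# clusters across the conversation. All names here are illustrative, not the
# paper's actual code; annotations are hand-written for the toy example.

from dataclasses import dataclass, field


@dataclass
class LocalEvent:
    """A verb with its related arguments inside a single utterance."""
    utterance_idx: int
    verb: str
    arguments: list


@dataclass
class DialogueRelations:
    """Relation features a relation-enhanced encoder could consume."""
    local_events: list = field(default_factory=list)    # one or more per utterance
    coref_clusters: list = field(default_factory=list)  # mention groups spanning turns


def build_relations(dialogue, clusters):
    """Toy extraction: collect per-utterance events (local relation) and
    attach cross-utterance coreference clusters (global relation)."""
    rels = DialogueRelations(coref_clusters=clusters)
    for i, (_speaker, _text, (verb, args)) in enumerate(dialogue):
        rels.local_events.append(LocalEvent(i, verb, args))
    return rels


# Each turn: (speaker, text, hand-annotated (verb, arguments) event).
dialogue = [
    ("Amanda", "I baked cookies.", ("baked", ["I", "cookies"])),
    ("Jerry", "Will she bring some tomorrow?", ("bring", ["she", "some", "tomorrow"])),
]
# "she" in turn 1 refers back to Amanda -- a global, cross-utterance link.
clusters = [["Amanda", "I", "she"]]

rels = build_relations(dialogue, clusters)
print(len(rels.local_events))   # 2
print(rels.coref_clusters[0])   # ['Amanda', 'I', 'she']
```

Local events summarize "who did what" inside each turn, while the coreference clusters let an encoder connect mentions of the same speaker or entity across turns, which is the information flow the model is described as exploiting.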
Junpeng Liu, Yanyan Zou, Yuxuan Xi, Shengjie Li, Mian Ma, Zhuoye Ding, Bo Long