Weijun Yang, Zhi-Cheng Tang, Xinhuai Tang
Recently, attentional seq2seq models have made remarkable progress on abstractive summarization. However, most of these models do not consider the relations between the original sentences, which are an important feature in extractive methods. In this work, we propose a hierarchical neural model to address this problem. First, we use self-attention to discover the relations between the original sentences. Second, we use a copy mechanism to address the out-of-vocabulary (OOV) problem. Experiments demonstrate that our model achieves state-of-the-art ROUGE scores on the LCSTS dataset.
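The sentence-level self-attention the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each original sentence is already encoded as a fixed-size vector, and it omits the learned query/key/value projections a trained model would use.

```python
import numpy as np

def sentence_self_attention(sent_embs):
    """Scaled dot-product self-attention over sentence embeddings.

    sent_embs: (n_sentences, dim) array, one row per original sentence.
    Returns attended vectors of the same shape, where each sentence's new
    representation mixes in the vectors of related sentences -- a rough
    stand-in for discovering inter-sentence relations.
    """
    n, dim = sent_embs.shape
    # Sketch only: queries, keys, and values are the raw embeddings;
    # a real model would first apply learned projection matrices.
    scores = sent_embs @ sent_embs.T / np.sqrt(dim)   # (n, n) relation scores
    scores -= scores.max(axis=-1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ sent_embs                        # (n, dim)

# Toy example: three "sentences" as 4-d vectors; the first two are similar,
# so they attend strongly to each other.
embs = np.array([[1.0, 0.0, 0.0, 0.0],
                 [0.9, 0.1, 0.0, 0.0],
                 [0.0, 0.0, 1.0, 0.0]])
attended = sentence_self_attention(embs)
print(attended.shape)  # (3, 4)
```

In the full model, these attended sentence representations would feed a decoder augmented with a copy mechanism, which lets the summarizer emit source words directly when they fall outside the output vocabulary.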
Raymond Li, Chuyuan Li, Gabriel Murray, Giuseppe Carenini
Aniv Chakravarty, Jagadish S. Kallimani
Xiangyu Duan, Hongfei Yu, Mingming Yin, Min Zhang, Weihua Luo, Yue Zhang