Kehai Chen, Rui Wang, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao
Traditional neural machine translation (NMT) methods use word-level context to predict the target-language translation while neglecting sentence-level context, which has been shown to benefit translation prediction in statistical machine translation. This paper represents the sentence-level context as latent topic representations learned by a convolutional neural network, and designs a topic attention mechanism to integrate source-side sentence-level topic information into both attention-based and Transformer-based NMT. In particular, our method improves NMT by modeling source topics and translations jointly. Experiments on the large-scale LDC Chinese-to-English and WMT'14 English-to-German translation tasks show that the proposed approach achieves significant improvements over baseline systems.
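The abstract describes two components: a convolutional network that pools word embeddings into a latent sentence-level topic representation, and a topic attention that weights those representations with the decoder state. A minimal NumPy sketch of that general pattern follows; all function names, shapes, and the tanh/max-pooling choices here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def conv_topic_representation(embeddings, filters, width=3):
    """1-D convolution over word embeddings followed by max-over-time
    pooling, yielding a fixed-size sentence-level topic vector.
    embeddings: (sentence_len, emb_dim); filters: (width*emb_dim, k)."""
    n, _ = embeddings.shape
    # Slide a window of `width` words and flatten each window.
    windows = np.stack([embeddings[i:i + width].reshape(-1)
                        for i in range(n - width + 1)])
    feature_maps = np.tanh(windows @ filters)   # (n-width+1, k)
    return feature_maps.max(axis=0)             # (k,) topic vector

def topic_attention(decoder_state, topic_vectors):
    """Attend over latent topic vectors with the current decoder state;
    returns a topic context vector to feed back into the decoder."""
    scores = topic_vectors @ decoder_state      # (num_topics,)
    weights = softmax(scores)
    return weights @ topic_vectors              # weighted topic context
```

In an NMT decoder, the returned topic context would be concatenated with (or added to) the usual word-level attention context at each decoding step, which is one plausible way to realize the joint modeling the abstract mentions.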
Mingming Yang, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Min Zhang, Tiejun Zhao
Longyue Wang, Zhaopeng Tu, Andy Way, Qun Liu
Tian Wu, Zhongjun He, Enhong Chen, Haifeng Wang
Zhang Li, Zhirui Zhang, Boxing Chen, Weihua Luo, Luo Si