In this paper we aim to alleviate the issue of lexical translation inconsistency in document-level neural machine translation (NMT) by modeling consistency preference for lexical chains, which consist of repeated words in a source-side document and provide a representation of the document's lexical consistency structure. Specifically, we first propose lexical-consistency attention to capture consistency context among words in the same lexical chain. Then, for each lexical chain, we define and learn a consistency-tailored latent variable, which guides the translation of the corresponding sentences to enhance lexical translation consistency. Experimental results on Chinese→English and French→English document-level translation tasks show that our approach not only significantly improves translation performance in BLEU, but also substantially alleviates the problem of lexical translation inconsistency.
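The abstract defines lexical chains as groups of repeated words across a source-side document. As a minimal sketch of that notion (not the paper's actual chain-construction procedure, which may use lemmatization or stricter filtering), the following groups surface-form repetitions into chains of (sentence, token) positions:

```python
from collections import defaultdict

def extract_lexical_chains(doc_sentences, min_occurrences=2):
    """Group repeated words across a document into lexical chains.

    A chain is the list of (sentence_index, token_index) positions
    sharing the same lowercased surface form; words occurring fewer
    than `min_occurrences` times form no chain.
    """
    positions = defaultdict(list)
    for s_idx, sentence in enumerate(doc_sentences):
        for t_idx, token in enumerate(sentence.split()):
            positions[token.lower()].append((s_idx, t_idx))
    # Keep only repeated words, which are the ones whose translations
    # should stay consistent across sentences.
    return {w: occ for w, occ in positions.items() if len(occ) >= min_occurrences}

doc = [
    "the transformer model translates documents",
    "the model keeps translations consistent",
    "consistency helps the model",
]
chains = extract_lexical_chains(doc)
```

Here the word "model" yields a chain spanning all three sentences, and a consistency-aware decoder would be encouraged to translate each of its occurrences identically.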
Xinglin Lyu, Junhui Li, Zhengxian Gong, M. Zhang