Sami Ul Haq, Sadaf Abdul Rauf, Arslan Shaukat, Muhammad Hassan Arif
Context-aware neural machine translation (NMT) has recently attracted much attention by promising to integrate sophisticated contextual information into conventional NMT. However, context-aware NMT is challenged by effective context aggregation and by increased training time due to the integration of extra information. In this work, we study the effect of encoding selective contextual information using pre-trained models for effective context integration and performance optimization. We conduct experiments on different context selection methods and show that encoding selected context significantly reduces training time while maintaining superiority over sentence-level NMT models. Specifically, we experiment on the IWSLT English↔German translation task and show that encoding selected keywords as context is sufficient and obtains the best translation results.
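The idea of selecting keywords from surrounding sentences and feeding them to the model as compact context can be illustrated with a minimal sketch. The stopword list, the `<sep>` separator token, and the frequency-based top-k heuristic below are illustrative assumptions for exposition, not the paper's exact selection method or tokenization.

```python
# Sketch: keyword-based context selection for context-aware NMT.
# All specifics (stopword list, separator, top-k rule) are assumptions.
from collections import Counter

STOPWORDS = {"the", "a", "an", "is", "are", "was", "were", "to", "of",
             "and", "in", "on", "for", "it", "that", "this", "with"}

def select_keywords(context_sentence: str, k: int = 3) -> list:
    """Pick up to k frequent non-stopword tokens from a context sentence."""
    tokens = [t.lower().strip(".,!?") for t in context_sentence.split()]
    content = [t for t in tokens if t and t not in STOPWORDS]
    return [w for w, _ in Counter(content).most_common(k)]

def build_input(source: str, context_sentence: str, sep: str = "<sep>") -> str:
    """Prepend selected context keywords to the source sentence."""
    keywords = select_keywords(context_sentence)
    return f"{' '.join(keywords)} {sep} {source}" if keywords else source

example = build_input(
    "Er hat es gestern gekauft.",           # source sentence (German)
    "John went to the store to buy a laptop.",  # preceding context sentence
)
```

The augmented input keeps only a few content words from the context, so the encoder sees far fewer extra tokens than with full-sentence concatenation, which is the intuition behind the reported training-time reduction.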