JOURNAL ARTICLE

Improving Context-Aware Neural Machine Translation Using Self-Attentive Sentence Embedding

Hyeongu Yun, Yongkeun Hwang, Kyomin Jung

Year: 2020   Journal: Proceedings of the AAAI Conference on Artificial Intelligence   Vol: 34 (05)   Pages: 9498-9506   Publisher: Association for the Advancement of Artificial Intelligence

Abstract

Fully Attentional Networks (FANs) such as the Transformer (Vaswani et al. 2017) have shown superior results in Neural Machine Translation (NMT) and have become a solid baseline for translation tasks. More recent studies have also reported experimental results showing that additional contextual sentences improve the translation quality of NMT models (Voita et al. 2018; Müller et al. 2018; Zhang et al. 2018). However, those studies exploit multiple context sentences as a single long concatenated sentence, which may cause the models to suffer from increased computational complexity and long-range dependency problems. In this paper, we propose the Hierarchical Context Encoder (HCE), which is able to exploit multiple context sentences separately using a hierarchical FAN structure. Our proposed encoder first abstracts sentence-level information from preceding sentences in a self-attentive way, and then hierarchically encodes context-level information. Through extensive experiments, we observe that our HCE records the best performance, measured in BLEU score, on English-German, English-Turkish, and English-Korean corpora. In addition, our HCE records the best performance on a crowd-sourced test set designed to evaluate how well an encoder can exploit contextual information. Finally, evaluation on an English-Korean pronoun resolution test suite also shows that our HCE can properly exploit contextual information.
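The two-stage encoding the abstract describes (self-attentive pooling of each preceding sentence into a vector, then a second attention pass over those sentence vectors) can be sketched as follows. This is an illustrative simplification, not the paper's exact HCE architecture: the scoring vectors `w_tok` and `w_sent` are hypothetical stand-ins for the learned attention parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attentive_pool(vectors, w):
    # vectors: (n, d) array; w: (d,) hypothetical learned scoring vector.
    # Scores each vector, then returns their attention-weighted sum.
    scores = softmax(vectors @ w)   # (n,)
    return scores @ vectors         # (d,)

def hierarchical_context(context_sentences, w_tok, w_sent):
    # Stage 1: abstract sentence-level information — pool each
    # preceding sentence's token embeddings into one sentence vector.
    sent_vecs = np.stack(
        [self_attentive_pool(s, w_tok) for s in context_sentences]
    )
    # Stage 2: encode context-level information — attend over the
    # sentence vectors to produce a single context embedding.
    return self_attentive_pool(sent_vecs, w_sent)

rng = np.random.default_rng(0)
d = 8
# Three context sentences of different lengths, as random token embeddings.
ctx = [rng.normal(size=(n, d)) for n in (5, 7, 4)]
c = hierarchical_context(ctx, rng.normal(size=d), rng.normal(size=d))
print(c.shape)  # a single d-dimensional context embedding
```

Because each sentence is pooled separately before the context-level pass, attention never spans a long concatenation of all context tokens, which is the efficiency argument the abstract makes against single-sequence concatenation.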

Keywords:
Computer science, Machine translation, Exploit, Artificial intelligence, Sentence, Natural language processing, Encoder, Test set, Context, Speech recognition

Metrics

Cited By: 20
FWCI (Field-Weighted Citation Impact): 1.54
References: 42
Citation Normalized Percentile: 0.86
Is in top 1%
Is in top 10%

Citation History

Topics

Natural Language Processing Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE

Context-Aware Neural Machine Translation using Selected Context

Sami Ul Haq, Sadaf Abdul Rauf, Arslan Shaukat, Muhammad Hassan Arif

Journal: 2022 19th International Bhurban Conference on Applied Sciences and Technology (IBCAST)   Year: 2022   Pages: 349-352
BOOK-CHAPTER

Improving Context-Aware Neural Machine Translation with Target-Side Context

Hayahide Yamagishi, Mamoru Komachi

Communications in Computer and Information Science   Year: 2020   Pages: 112-122
JOURNAL ARTICLE

Context-aware neural machine translation

Christian Herold

Journal: RWTH Publications (RWTH Aachen)   Year: 2024