JOURNAL ARTICLE

Document-Level Neural Machine Translation with Hierarchical Attention Networks

Miculicich Werlen, Lesly; Ram, Dhananjay; Pappas, Nikolaos; Henderson, James

Year: 2018
Journal: Zenodo (CERN European Organization for Nuclear Research)
Publisher: European Organization for Nuclear Research

Abstract

Neural Machine Translation (NMT) can be improved by including document-level contextual information. For this purpose, we propose a hierarchical attention model to capture the context in a structured and dynamic manner. The model is integrated into the original NMT architecture as another level of abstraction, conditioning on the NMT model's own previous hidden states. Experiments show that hierarchical attention significantly improves the BLEU score over a strong NMT baseline with the state-of-the-art in context-aware methods, and that both the encoder and decoder benefit from context in complementary ways.
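The two-level attention the abstract describes can be sketched in a few lines: a query (the current hidden state) first attends over the words of each previous sentence to form per-sentence summaries, then attends over those summaries to form a single document-context vector. The numpy sketch below is illustrative only; plain dot-product scoring stands in for the paper's learned attention, and all function and variable names are assumptions, not the authors' code.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hierarchical_context(query, prev_sentences):
    """Two-level (word, then sentence) attention over previous-sentence states.

    query          -- (d,) current hidden state of the NMT model
    prev_sentences -- list of (n_i, d) arrays, hidden states of prior sentences
    returns        -- (d,) document-context vector
    """
    # Word-level attention: summarize each previous sentence w.r.t. the query.
    summaries = []
    for H in prev_sentences:
        alpha = softmax(H @ query)      # (n_i,) word weights
        summaries.append(alpha @ H)     # (d,) weighted sentence summary
    S = np.stack(summaries)             # (k, d) one summary per sentence
    # Sentence-level attention over the summaries.
    beta = softmax(S @ query)           # (k,) sentence weights
    return beta @ S                     # (d,) context vector
```

In the paper this context vector is then gated into the encoder or decoder states; here a simple call such as `hierarchical_context(h_t, [H1, H2])` returns a `(d,)` vector that could be combined with `h_t` by a learned gate.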

Keywords:
Machine translation; Context; Translation; Artificial neural network; Encoder; Baseline; Hierarchical model; Attention network

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 0
Citation Normalized Percentile: 0.38

Topics

Natural Language Processing Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition