JOURNAL ARTICLE

Knowledge‐Grounded Attention‐Based Neural Machine Translation Model

Huma Israr, Safdar Abbas Khan, Muhammad Ali Tahir, Muhammad K. Shahzad, Muneer Ahmad, Jasni Mohamad Zain

Year: 2025 Journal: Applied Computational Intelligence and Soft Computing Vol: 2025 (1) Publisher: Hindawi Publishing Corporation

Abstract

Neural machine translation (NMT) models process sentences in isolation and ignore contextual or side information beyond the sentence itself. The input text alone often provides too little knowledge to generate a contextually correct and meaningful translation, and relying solely on it can yield inaccurate output. Side information related to either the source or the target side is helpful in the context of NMT. In this study, we empirically show that training an NMT model with target-side additional information, used as knowledge, can significantly improve translation quality. The acquired knowledge is leveraged in an encoder-decoder model through a multiencoder framework: an additional encoder converts the knowledge into a dense semantic representation (attention), and the attentions over the input sentence and the additional knowledge are then combined into a unified attention. The decoder generates the translation by conditioning on both the input text and the acquired knowledge. Evaluation of Urdu-to-English translation in a low-resource setting yields promising results in terms of both perplexity reduction and improved BLEU scores: the proposed models outperform their LSTM and GRU counterparts with an attention mechanism by +3.1 and +2.9 BLEU, respectively. Extensive analysis confirms our claim that translations influenced by the additional information may occasionally contain rare low-frequency words and remain faithful to the source. Experimental results on a different language pair, DE-EN, demonstrate that the suggested method is efficient and general.
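The fusion step described in the abstract — attending separately to the input sentence and the side knowledge, then merging the two contexts into one unified attention — can be illustrated with a minimal numpy sketch. The dot-product scoring and the fixed interpolation gate below are illustrative assumptions; the paper's model learns how the two attentions are fused.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, states):
    """Dot-product attention: a distribution over encoder states,
    returned as their weighted sum (the context vector)."""
    scores = states @ query          # (T,)
    weights = softmax(scores)        # attention weights, sum to 1
    return weights @ states          # (d,)

def unified_context(query, src_states, know_states, gate=0.5):
    """Combine the source-sentence context and the knowledge context
    into one unified attention context. A fixed scalar gate is an
    assumption made for illustration only."""
    c_src = attend(query, src_states)
    c_know = attend(query, know_states)
    return gate * c_src + (1.0 - gate) * c_know

rng = np.random.default_rng(0)
d = 8
src = rng.normal(size=(5, d))    # encoder states for the input sentence
know = rng.normal(size=(3, d))   # encoder states for target-side knowledge
query = rng.normal(size=(d,))    # decoder hidden state at one step
c = unified_context(query, src, know)
print(c.shape)  # (8,)
```

The decoder would condition on `c` at every step, so each generated word is influenced by both the source sentence and the acquired knowledge; with `gate=1.0` the model reduces to ordinary single-encoder attention.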

Keywords:
Computer science, Machine translation, Artificial intelligence, Natural language processing, Human-computer interaction

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 34
Citation Normalized Percentile: 0.00

Topics

Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Semantic Web and Ontologies (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Bilingual attention based neural machine translation

Liyan Kang, Shaojie He, Mingxuan Wang, Fei Long, Jinsong Su

Journal: Applied Intelligence Year: 2022 Vol: 53 (4) Pages: 4302-4315
JOURNAL ARTICLE

Group-attention Based Neural Machine Translation

Renyu Hu, Hao Xu, Yang Xiao, Chenjun Wu, Haiyang Jia

Journal: IOP Conference Series: Materials Science and Engineering Year: 2020 Vol: 782 (2) Pages: 022080
JOURNAL ARTICLE

Neural Machine Translation with Target-Attention Model

Mingming Yang, Min Zhang, Kehai Chen, Rui Wang, Tiejun Zhao

Journal: IEICE Transactions on Information and Systems Year: 2020 Vol: E103.D (3) Pages: 684-694