JOURNAL ARTICLE

English Machine Translation Model Based on an Improved Self-Attention Technology

Wenxia Pan

Year: 2021 Journal: Scientific Programming Vol: 2021 Pages: 1-11 Publisher: Hindawi Publishing Corporation

Abstract

English machine translation is a natural language processing research direction with significant scientific and practical value in the current artificial intelligence boom. The variability of language, the limited capacity to express semantic information, and the scarcity of parallel corpus resources all restrict the usefulness and adoption of English machine translation in practice. The self-attention mechanism has received considerable attention in English machine translation tasks because its highly parallelizable computation shortens model training time and allows the model to capture the semantic relevance of all words in the context. Unlike recurrent neural networks, however, self-attention ignores the position and structure information of context words. To give the model access to word-order information, self-attention-based translation models use sine and cosine position encoding to represent the absolute position of each word; this encoding reflects relative distance between words but provides no directionality. This paper therefore proposes a new English machine translation model based on a logarithmic position representation method combined with the self-attention mechanism. The model preserves both the distance and the direction between words while retaining the efficiency of self-attention. Experiments show that the nonstrict phrase extraction method effectively extracts phrase translation pairs from n-best word alignment results and that the extraction constraint strategy further improves translation quality. Compared with traditional phrase extraction based on a single alignment, nonstrict phrase extraction over n-best alignment results significantly improves translation quality.
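The abstract's claim that sinusoidal position encoding captures relative distance but not direction can be illustrated concretely. The sketch below (a minimal illustration in the style of the standard Transformer encoding, not the paper's own code) builds the sine/cosine encoding and shows that the dot product between two position vectors depends only on the magnitude of their offset, so an offset of +3 and an offset of -3 are indistinguishable:

```python
import math

def sinusoidal_encoding(pos, d_model=16):
    """Absolute sine/cosine position encoding for a single position index."""
    vec = []
    for i in range(0, d_model, 2):
        angle = pos / (10000 ** (i / d_model))
        vec.append(math.sin(angle))  # even dimensions: sine
        vec.append(math.cos(angle))  # odd dimensions: cosine
    return vec

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# dot(PE(i), PE(j)) reduces to sum_k cos((i - j) * w_k), an even function
# of the offset: position 8 (offset +3 from 5) and position 2 (offset -3)
# produce the same similarity score, i.e. direction is lost.
e5, e8, e2 = (sinusoidal_encoding(p) for p in (5, 8, 2))
assert abs(dot(e5, e8) - dot(e5, e2)) < 1e-9
```

This symmetry is exactly the limitation the proposed logarithmic position representation is meant to address, by encoding signed (directional) offsets rather than only symmetric distances.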

Keywords:
Computer science; Machine translation; Artificial intelligence; Natural language processing; Example-based machine translation; Rule-based machine translation; Transfer-based machine translation; Phrase; Context

Metrics

Cited By: 4
FWCI (Field Weighted Citation Impact): 0.56
Refs: 31
Citation Normalized Percentile: 0.74

Topics

Natural Language Processing Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Computational Techniques and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Re-Transformer: A Self-Attention Based Model for Machine Translation

Huey-Ing Liu, Weilin Chen

Journal: Procedia Computer Science Year: 2021 Vol: 189 Pages: 3-10
JOURNAL ARTICLE

Chinese-English machine translation model based on transfer learning and self-attention

Shu Ma

Journal: DOAJ (DOAJ: Directory of Open Access Journals) Year: 2024
JOURNAL ARTICLE

Dependency-Based Self-Attention for Transformer Neural Machine Translation

Hiroyuki Deguchi, Akihiro Tamura, Takashi Ninomiya

Journal: Journal of Natural Language Processing Year: 2020 Vol: 27 (3) Pages: 553-571