JOURNAL ARTICLE

The Subword‐Character Multi‐Scale Transformer With Learnable Positional Encoding for Machine Translation

Wenjing Yao, Wei Zhou

Year: 2025 | Journal: Engineering Reports | Vol: 7 (7) | Publisher: Wiley

Abstract

The transformer model addresses the efficiency bottleneck caused by sequential computation in traditional recurrent neural networks (RNNs) by leveraging the self‐attention mechanism to capture global dependencies in parallel. However, the subword‐level modeling units and fixed‐pattern positional encodings adopted by mainstream methods struggle to capture fine‐grained feature information in morphologically rich languages, which limits the model's ability to flexibly learn target‐side word order patterns. To address these challenges, this study constructs a subword‐character multi‐scale transformer architecture integrated with a learnable positional encoding mechanism. The model abandons traditional fixed‐pattern positional encodings, allowing the positional representation spaces of the source and target languages to be optimized autonomously through end‐to‐end training and significantly enhancing dynamic adaptability in cross‐linguistic positional mapping. While preserving the global semantic modeling advantages of subword units, the framework introduces a lightweight character‐level branch to supplement fine‐grained features. To fuse the subword and character branches, it employs context‐aware cross‐attention, enabling dynamic integration of linguistic information at different granularities. The model achieves notable BLEU‐score improvements on the WMT'14 English‐German (En‐De), WMT'17 Chinese‐English (Zh‐En), and WMT'16 English‐Romanian (En‐Ro) benchmark tasks. These results demonstrate the synergistic effects of fine‐grained multi‐scale modeling and learnable positional encoding in enhancing translation quality and linguistic adaptability.
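The two core ideas in the abstract, positional encodings learned as trainable parameters rather than fixed sinusoids, and a cross‐attention step where subword states query character states, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; all names, dimensions, and the residual fusion step are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

d, max_len = 16, 32

# Learnable positional encoding: a trainable lookup table instead of a
# fixed sinusoidal formula. In training this table would be updated by
# backpropagation; here it is just randomly initialized.
pos_table = rng.normal(0.0, 0.02, size=(max_len, d))

def add_positions(x):
    # x: (seq_len, d) token embeddings; positions are looked up, not computed.
    return x + pos_table[: x.shape[0]]

def cross_attention(sub, char, Wq, Wk, Wv):
    # Subword states act as queries over character-level keys/values,
    # pulling fine-grained detail into each subword representation.
    q, k, v = sub @ Wq, char @ Wk, char @ Wv
    scores = q @ k.T / np.sqrt(q.shape[-1])   # (n_sub, n_char)
    return softmax(scores) @ v                # (n_sub, d)

n_sub, n_char = 6, 20   # a subword sequence and its character sequence
sub = add_positions(rng.normal(size=(n_sub, d)))
char = add_positions(rng.normal(size=(n_char, d)))
Wq, Wk, Wv = (rng.normal(0.0, 0.1, size=(d, d)) for _ in range(3))

# Residual fusion of the two granularities (one of several plausible choices).
fused = sub + cross_attention(sub, char, Wq, Wk, Wv)
print(fused.shape)
```

The residual-add fusion shown here is one simple choice; the paper's context‐aware fusion presumably learns how much character‐level information to inject per position, but the query/key/value flow between the two branches is the same.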

Keywords:
Transformer; Machine translation; Positional encoding; Natural language processing; Artificial intelligence; Computer science

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
References: 48
Citation Normalized Percentile: 0.10

Topics

Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Handwritten Text Recognition Techniques (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)