JOURNAL ARTICLE

STA: An efficient data augmentation method for low-resource neural machine translation

Fuxue Li, Chuncheng Chi, Hong Yan, Beibei Liu, Mingzhi Shao

Year: 2023   Journal: Journal of Intelligent & Fuzzy Systems   Vol: 45 (1)   Pages: 121-132   Publisher: IOS Press

Abstract

Transformer-based models have achieved state-of-the-art performance in neural machine translation (NMT). However, they rely on the availability of copious parallel corpora; for low-resource language pairs, the amount of parallel data is insufficient, resulting in poor translation quality. To alleviate this issue, this paper proposes an efficient data augmentation (DA) method named STA. First, pseudo-parallel sentence pairs are generated by translating sentence trunks with a target-to-source NMT model. Then, two strategies are introduced to merge the original data and the pseudo-parallel corpus to augment the training set. Experimental results on simulated and real low-resource translation tasks show that the proposed method improves translation quality over a strong baseline and also outperforms other data augmentation methods. Moreover, STA can further improve translation quality when combined with back-translation on extra monolingual data.
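The augmentation pipeline the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the trunk extractor and the target-to-source model below are hypothetical stand-ins (a crude word filter and a tagging function), and the merge step shown is plain concatenation, one of several possible strategies.

```python
def extract_trunk(sentence):
    # Hypothetical trunk extraction: drop short function words as a crude
    # stand-in for the paper's sentence-trunk selection.
    return " ".join(w for w in sentence.split() if len(w) > 3)

def back_translate(target_sentence):
    # Stand-in for a trained target-to-source NMT model; here it merely
    # tags its input so the data flow stays visible.
    return "src(" + target_sentence + ")"

def augment(parallel_pairs):
    """Generate pseudo-parallel pairs from target-side trunks, then merge
    them with the original data (simple concatenation strategy)."""
    pseudo = []
    for _, tgt in parallel_pairs:
        trunk = extract_trunk(tgt)
        pseudo.append((back_translate(trunk), trunk))
    return parallel_pairs + pseudo  # augmented training set

pairs = [("ein kleines Haus", "a small house")]
train = augment(pairs)
print(len(train))  # one original pair plus one pseudo-parallel pair
```

In practice the stand-ins would be replaced by a parser-based trunk extractor and a real target-to-source Transformer, and the merged set would be deduplicated and shuffled before training.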

Keywords:
Machine translation, Computer science, Transformer, Sentence, Artificial intelligence, Natural language processing, Training set, Speech recognition, Information retrieval

Metrics

- Cited By: 6
- FWCI (Field Weighted Citation Impact): 1.53
- References: 31
- Citation Normalized Percentile: 0.81


Topics

- Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
- Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
- Multimodal Machine Learning Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)