JOURNAL ARTICLE

Exploring Cross-Lingual Transfer Learning with Unsupervised Machine Translation

Abstract

In Natural Language Understanding (NLU), to facilitate Cross-Lingual Transfer Learning (CLTL), especially CLTL between distant languages, we integrate CLTL with Machine Translation (MT) and thereby propose a novel CLTL model named Translation Aided Language Learner (TALL). TALL is constructed as a standard transformer whose encoder is a pre-trained multilingual language model. The training of TALL comprises an MT-oriented pre-training stage and an NLU-oriented fine-tuning stage. To make use of unannotated data, we apply the recently proposed Unsupervised Machine Translation (UMT) technique in the MT-oriented pre-training of TALL. Experimental results show that the application of UMT enables TALL to consistently achieve better CLTL performance than our baseline model (the pre-trained multilingual language model that serves as TALL's encoder) without using more annotated data, and that the performance gain is especially prominent for distant language pairs.
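The abstract's two-stage recipe (UMT-based MT pre-training on unannotated text, then NLU fine-tuning on annotated source-language data) can be sketched as a minimal training schedule. This is a hypothetical illustration of the staged pipeline only, not the authors' implementation; all class and method names are assumptions.

```python
# Hypothetical sketch of TALL's two-stage training schedule.
# All names here are illustrative assumptions, not the authors' code.

class TALL:
    """Encoder-decoder transformer; the encoder is a pre-trained
    multilingual language model, so it starts with useful weights."""

    def __init__(self):
        self.stage = "init"  # encoder weights are assumed pre-trained

    def pretrain_umt(self, monolingual_corpora):
        # Stage 1: MT-oriented pre-training with Unsupervised MT,
        # i.e. learning translation from unannotated monolingual text
        # (typically denoising autoencoding plus back-translation).
        self.stage = "umt_pretrained"
        return len(monolingual_corpora)  # corpora consumed

    def finetune_nlu(self, labelled_source_data):
        # Stage 2: NLU-oriented fine-tuning on annotated data in the
        # source language only; the target language is reached by
        # cross-lingual transfer, with no extra annotated data.
        assert self.stage == "umt_pretrained"
        self.stage = "nlu_finetuned"


model = TALL()
model.pretrain_umt(["en_monolingual", "sw_monolingual"])
model.finetune_nlu("en_labelled_nlu_task")
print(model.stage)  # -> nlu_finetuned
```

The point of the ordering is the claim in the abstract: the fine-tuned model uses no more annotated data than the baseline, since stage 1 consumes only unannotated corpora.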

Keywords:
Machine translation, Transformer, Natural language processing, Artificial intelligence, Transfer learning, Encoder, Unsupervised learning, Training set, Language model, Computer science, Engineering

Metrics

Cited by: 3
FWCI (Field-Weighted Citation Impact): 0.42
References: 40
Citation Normalized Percentile: 0.68

Topics

Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Natural Language Processing Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition