JOURNAL ARTICLE

Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer

Ziqing Yang, Wentao Ma, Yiming Cui, Jiani Ye, Wanxiang Che, Shijin Wang

Year: 2021   Journal: arXiv (Cornell University)   Pages: 100-105   Publisher: Cornell University

Abstract

Multilingual pre-trained models have achieved remarkable performance on cross-lingual transfer learning. Some multilingual models, such as mBERT, are pre-trained only on unlabeled corpora, so the embeddings of different languages in these models may not be well aligned. In this paper, we aim to improve zero-shot cross-lingual transfer performance by proposing a pre-training task named Word-Exchange Aligning Model (WEAM), which uses statistical alignment information as prior knowledge to guide cross-lingual word prediction. We evaluate our model on the multilingual machine reading comprehension task MLQA and the natural language inference task XNLI. The results show that WEAM can significantly improve zero-shot performance.
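To make the abstract's idea concrete, here is a minimal, hypothetical sketch of the data-preparation side of a WEAM-style objective: given a parallel sentence pair and word alignments from a statistical aligner (the "prior knowledge" the abstract refers to, e.g. from a tool such as fast_align), a source word is masked and the prediction target becomes its aligned target-language word. The function `weam_examples`, the toy sentences, and the masking probability `p` are illustrative assumptions, not the authors' implementation.

```python
import random

def weam_examples(src_tokens, tgt_tokens, alignments,
                  mask="[MASK]", p=0.5, seed=0):
    """Build (masked source sentence, target-word label) pairs.

    alignments: list of (src_index, tgt_index) pairs produced by a
    statistical word aligner; each selected pair yields one training
    example where the model must predict the target-language word
    aligned to the masked source position.
    """
    rng = random.Random(seed)
    examples = []
    for s_i, t_i in alignments:
        if rng.random() < p:          # mask only a fraction of aligned words
            masked = list(src_tokens)
            masked[s_i] = mask        # hide the source word
            examples.append((masked, tgt_tokens[t_i]))
    return examples

# Toy parallel pair with a trivial one-to-one alignment
src = ["the", "cat", "sleeps"]
tgt = ["le", "chat", "dort"]
align = [(0, 0), (1, 1), (2, 2)]
for masked, label in weam_examples(src, tgt, align, p=1.0):
    print(masked, "->", label)
```

In a full pre-training setup, each example would then feed a masked-word prediction loss over the target-language vocabulary, encouraging the embeddings of aligned words across languages to converge.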

Keywords:
Computer science, Natural language processing, Artificial intelligence, Transfer learning, Zero-shot learning, Reading comprehension, Linguistics

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 9

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Text Readability and Simplification (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

BAD-X: Bilingual Adapters Improve Zero-Shot Cross-Lingual Transfer

Marinela Parović, Goran Glavaš, Ivan Vulić, Anna Korhonen

Journal:   Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies Year: 2022 Pages: 1791-1799
JOURNAL ARTICLE

Cross-Lingual Pre-Training Based Transfer for Zero-Shot Neural Machine Translation

Baijun Ji, Zhirui Zhang, Xiangyu Duan, Min Zhang, Boxing Chen, Weihua Luo

Journal:   Proceedings of the AAAI Conference on Artificial Intelligence Year: 2020 Vol: 34 (01) Pages: 115-122
JOURNAL ARTICLE

Improving Zero-Shot Cross-Lingual Transfer Learning via Robust Training

Kuan-Hao Huang, Wasi Uddin Ahmad, Nanyun Peng, Kai-Wei Chang

Journal:   Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing Year: 2021 Pages: 1684-1697