JOURNAL ARTICLE

Isotropic Representation Can Improve Zero-Shot Cross-Lingual Transfer on Multilingual Language Models

Abstract

With the development of multilingual pre-trained language models (mPLMs), zero-shot cross-lingual transfer shows great potential. To further improve cross-lingual transfer performance, many studies have explored the representation misalignment caused by morphological differences, but have neglected the misalignment caused by the anisotropic distribution of contextual representations. In this work, we propose enhanced isotropy and constrained code-switching for zero-shot cross-lingual transfer, alleviating the misalignment caused by anisotropic representations while preserving syntactic structural knowledge. Extensive experiments on three zero-shot cross-lingual transfer tasks demonstrate that our method yields significant improvements over strong mPLM backbones and further improves on state-of-the-art methods.
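The abstract does not detail how isotropy is measured or enhanced. As an illustrative sketch only (not the authors' implementation), anisotropy of contextual representations is commonly quantified as the average pairwise cosine similarity of embeddings (Ethayarajh, 2019), and a standard isotropy-enhancing baseline is the "all-but-the-top" post-processing of Mu & Viswanath (2018): mean-center the embeddings and remove their top principal components. The helper names `average_cosine_similarity` and `all_but_the_top` below are hypothetical, chosen for this sketch.

```python
import numpy as np


def average_cosine_similarity(embeddings: np.ndarray) -> float:
    """Estimate anisotropy as mean pairwise cosine similarity.

    Values near 0 suggest isotropic (directionally uniform)
    representations; values near 1 indicate a narrow cone.
    """
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = normed @ normed.T
    n = len(embeddings)
    # Exclude the self-similarity entries on the diagonal (all equal to 1).
    return (sims.sum() - n) / (n * (n - 1))


def all_but_the_top(embeddings: np.ndarray, n_components: int = 2) -> np.ndarray:
    """Mean-center and remove the top principal components.

    This is the standard isotropy-enhancing post-processing of
    Mu & Viswanath (2018), shown here only as a baseline sketch.
    """
    centered = embeddings - embeddings.mean(axis=0, keepdims=True)
    # Rows of vt are the principal directions of the centered matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    top = vt[:n_components]  # shape: (n_components, dim)
    # Subtract the projection onto the dominant directions.
    return centered - centered @ top.T @ top


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulate anisotropic contextual embeddings: a large shared offset
    # forces all vectors into a narrow cone, as observed in mPLMs.
    X = rng.normal(size=(500, 768)) + 10.0 * rng.normal(size=(1, 768))
    print(f"anisotropy before: {average_cosine_similarity(X):.3f}")
    print(f"anisotropy after:  {average_cosine_similarity(all_but_the_top(X)):.3f}")
```

Running the sketch shows the average cosine similarity dropping from near 1 toward 0 after post-processing, the kind of isotropy gain the paper targets with a learned objective rather than a fixed transform.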

Keywords:
Isotropy; Anisotropy; Zero-shot cross-lingual transfer; Multilingual pre-trained language models; Code-switching; Contextual representations; Natural language processing; Artificial intelligence

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
References: 61
Citation Normalized Percentile: 0.17

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Multimodal Machine Learning Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)