CONFERENCE PAPER

Nearest Neighbour Few-Shot Learning for Cross-lingual Classification

Mehwish Bari, Batool A Haider, Saab Mansour

Year: 2021 · Venue: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing · Pages: 1745-1753

Abstract

Even though large pre-trained multilingual models (e.g., mBERT, XLM-R) have led to significant performance gains on a wide range of cross-lingual NLP tasks, success on many downstream tasks still relies on the availability of sufficient annotated data. Traditional fine-tuning of pre-trained models using only a few target samples can cause over-fitting. This can be quite limiting, as most languages in the world are under-resourced. In this work, we investigate cross-lingual adaptation using a simple nearest-neighbor few-shot (<15 samples) inference technique for classification tasks. We experiment with a total of 16 distinct languages across two NLP tasks: XNLI and PAWS-X. Our approach consistently improves over traditional fine-tuning using only a handful of labeled samples in target locales. We also demonstrate its generalization capability across tasks.
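The abstract describes classifying target-language inputs by nearest-neighbor lookup over a small labeled support set rather than by fine-tuning. A minimal sketch of that idea, using toy NumPy vectors as stand-ins for the multilingual sentence embeddings (the paper's actual encoder and distance function are not specified here, so cosine similarity and majority voting are assumptions):

```python
import numpy as np

def nearest_neighbor_classify(query_emb, support_embs, support_labels, k=1):
    """Predict a label by majority vote among the k most cosine-similar
    support embeddings (the few-shot labeled target-language samples)."""
    # Normalize so that a dot product equals cosine similarity.
    q = query_emb / np.linalg.norm(query_emb)
    s = support_embs / np.linalg.norm(support_embs, axis=1, keepdims=True)
    sims = s @ q
    # Indices of the k nearest (most similar) support samples.
    top_k = np.argsort(-sims)[:k]
    votes = [support_labels[i] for i in top_k]
    return max(set(votes), key=votes.count)

# Toy 2-d embeddings standing in for mBERT/XLM-R sentence representations.
support = np.array([[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.9]])
labels = ["entailment", "entailment", "contradiction", "contradiction"]
query = np.array([0.95, 0.15])
print(nearest_neighbor_classify(query, support, labels, k=3))  # → entailment
```

Because inference reduces to a similarity search over at most ~15 stored vectors per class, no gradient updates are needed in the target language, which is what avoids the over-fitting risk the abstract mentions.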

Keywords:
Computer science, Artificial intelligence, Machine learning, Natural language processing, k-nearest neighbors algorithm, Nearest neighbour, Inference, Generalization, Pattern recognition, Mathematics

Metrics

Cited by: 5
FWCI (Field-Weighted Citation Impact): 0.49
References: 50
Citation Normalized Percentile: 0.65

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Speech Recognition and Synthesis
Physical Sciences →  Computer Science →  Artificial Intelligence