JOURNAL ARTICLE

Knowledge distillation for BERT unsupervised domain adaptation

Minho Ryu, Geonseok Lee, Kichun Lee

Year: 2022   Journal: Knowledge and Information Systems   Vol: 64 (11)   Pages: 3113-3128   Publisher: Springer Science+Business Media
Keywords:
Computer science, Discriminative model, Artificial intelligence, Domain adaptation, Distillation, Machine learning, Domain (mathematical analysis), Adaptation (eye), Language model, Natural language processing, Task (project management), Adversarial system, Labeled data, Pattern recognition (psychology), Classifier (UML), Mathematics

Metrics

Cited By: 33
FWCI (Field Weighted Citation Impact): 6.07
Refs: 34
Citation Normalized Percentile: 0.95
Is in top 1%
Is in top 10%

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Domain Adaptation and Few-Shot Learning (Physical Sciences → Computer Science → Artificial Intelligence)