JOURNAL ARTICLE

Domain Adaptation for Tibetan-Chinese Neural Machine Translation

Abstract

The meaning of a word or sentence can shift across semantic contexts, which makes it difficult for a general-purpose translation system to maintain stable performance across domains. Domain adaptation is therefore an essential research topic in neural machine translation practice. To train translation models for different domains efficiently, in this work we take a Tibetan-Chinese general translation model as the parent model and obtain two domain-specific Tibetan-Chinese translation models using small-scale in-domain data. The empirical results indicate that the method is a practical approach to domain adaptation in low-resource scenarios, yielding higher BLEU scores and faster training than our general baseline models.
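The abstract describes a parent-child training scheme: a general-domain parent model is trained first, and each domain-specific child model continues training from the parent's parameters on a small in-domain corpus. As a minimal sketch of that workflow (a toy scalar "model" fit by gradient descent, not the authors' actual NMT architecture, and all data here is invented for illustration):

```python
# Toy sketch of parent-child domain adaptation: train a general "parent"
# model, then fine-tune "child" models from the parent's parameters on
# small in-domain corpora. The scalar model and corpora are hypothetical
# stand-ins for an NMT model and its training data.

def train(w, corpus, epochs=100, lr=0.1):
    """Minimise squared error of a scalar parameter against corpus targets."""
    for _ in range(epochs):
        grad = sum(2 * (w - y) for y in corpus) / len(corpus)
        w -= lr * grad
    return w

# Stage 1: train the parent model on (large) general-domain data.
general_corpus = [1.0, 2.0, 3.0, 2.0]                  # toy stand-in
parent = train(0.0, general_corpus)                    # converges near 2.0

# Stage 2: fine-tune child models from the parent on small in-domain data,
# using far fewer updates than training from scratch would need.
child_a = train(parent, [4.0, 5.0], epochs=20)         # domain A
child_b = train(parent, [0.5, 1.0], epochs=20)         # domain B
```

Initialising each child from the parent, rather than from scratch, is what gives the faster in-domain training the abstract reports, since the child starts close to a good general solution.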

Keywords:
Machine translation, Domain adaptation, Natural language processing, Artificial intelligence, Machine learning, Computer science, Linguistics

Metrics

Cited by: 4
FWCI (Field-Weighted Citation Impact): 0.44
References: 3
Citation Normalized Percentile: 0.71

Topics

Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Multimodal Machine Learning Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)