The meaning of the same word or sentence is likely to change across semantic contexts, which challenges general-purpose translation systems to maintain stable performance across domains. Domain adaptation is therefore an essential research topic in Neural Machine Translation practice. To train translation models for different domains efficiently, in this work we take a general-domain Tibetan-Chinese translation model as the parent model and obtain two domain-specific Tibetan-Chinese translation models using small-scale in-domain data. The empirical results indicate that the method offers a practical approach to domain adaptation in low-resource scenarios, yielding better BLEU scores as well as faster training than our general baseline models.
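The parent-model setup described above amounts to continued training of a pretrained general-domain system on a small in-domain parallel corpus. The sketch below illustrates this pattern with the Hugging Face Transformers library; the checkpoint path, data file, and hyperparameters are placeholders, not the paper's actual toolkit or settings.

```python
# Minimal sketch: continue training a general-domain Tibetan-Chinese parent
# model on a small in-domain parallel corpus (domain adaptation by fine-tuning).
# The checkpoint path and data file below are hypothetical stand-ins.

from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

# Parent model: the general-domain Tibetan-Chinese system (hypothetical path).
checkpoint = "path/to/general-bo-zh-model"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Small in-domain corpus: one {"bo": ..., "zh": ...} pair per JSON line.
raw = load_dataset("json", data_files={"train": "in_domain_train.jsonl"})

def preprocess(batch):
    # Tokenize the Tibetan source and the Chinese target sides.
    model_inputs = tokenizer(batch["bo"], max_length=128, truncation=True)
    labels = tokenizer(text_target=batch["zh"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_ds = raw["train"].map(preprocess, batched=True, remove_columns=["bo", "zh"])

args = Seq2SeqTrainingArguments(
    output_dir="bo-zh-domain-model",
    learning_rate=5e-5,              # smaller LR than from-scratch training
    num_train_epochs=3,              # few epochs suffice with a strong parent
    per_device_train_batch_size=16,
    save_strategy="epoch",
)

trainer = Seq2SeqTrainer(
    model=model,                     # all weights start from the parent model
    args=args,
    train_dataset=train_ds,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```

Because every parameter is initialized from the parent model rather than at random, convergence on the small in-domain corpus is fast, which is consistent with the training-speed gains reported above.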
Wen Lai, Xiaobing Zhao, Xiaqing Li