Transliterating named entities from one language into another can be approached as a neural machine translation (NMT) problem, for which we use deep attentional RNN encoder-decoder models. To build a strong transliteration system, we apply well-established techniques from NMT, such as dropout regularization, model ensembling, rescoring with right-to-left models, and back-translation. Our submission to the NEWS 2018 Shared Task on Named Entity Transliteration ranked first in several tracks.
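Two of the decoding-time techniques mentioned above, model ensembling and rescoring with right-to-left models, can be illustrated with a minimal sketch. The snippet below is a hypothetical illustration, not the submission's implementation: it averages per-model log-probabilities over an ensemble of left-to-right models, then interpolates that score with a right-to-left model's score to rerank an n-best list. The candidate strings, scores, and the interpolation weight are all invented for the example.

```python
def ensemble_logprob(logprobs):
    """Average log-probabilities from several models
    (log of the geometric mean of their probabilities)."""
    return sum(logprobs) / len(logprobs)

def rescore_nbest(nbest, r2l_scores, weight=0.5):
    """Rerank an n-best list of transliteration candidates.

    nbest:      list of (candidate, [per-model L2R log-probs])
    r2l_scores: dict mapping candidate -> right-to-left log-prob
    weight:     interpolation weight for the R2L score (hypothetical value)
    Returns the list sorted by the combined score, best first.
    """
    reranked = []
    for cand, l2r_logprobs in nbest:
        combined = ((1 - weight) * ensemble_logprob(l2r_logprobs)
                    + weight * r2l_scores[cand])
        reranked.append((cand, combined))
    reranked.sort(key=lambda item: item[1], reverse=True)
    return reranked

# Invented example: two candidate transliterations of a Cyrillic name.
nbest = [("Aleksandr", [-1.0, -1.2]),   # preferred by the L2R ensemble
         ("Alexander", [-1.3, -1.1])]
r2l = {"Aleksandr": -2.0, "Alexander": -0.5}  # R2L model disagrees
print(rescore_nbest(nbest, r2l))
```

With these made-up scores, the right-to-left model's preference outweighs the left-to-right ensemble and flips the ranking, which is the intended effect of rescoring: the two decoding directions make different errors, so combining them filters out candidates that only look good from one side.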