Zijian Győző Yang, László János Laki
Generative ability is a crucial requirement for artificial intelligence applications such as chatbots, virtual assistants, and machine translation systems. In recent years, transformer-based neural architectures have given a huge boost to generating human-like English texts. In our research, we conducted experiments to create pre-trained generative transformer models for the Hungarian language and to fine-tune them for multiple types of natural language processing tasks. Our focus was on training multilingual models. We pre-trained a multilingual BART model, then fine-tuned it for various NLP tasks, such as text classification and abstractive summarization. In our experiments, we focused on transfer learning techniques to increase performance. Furthermore, an M2M100 multilingual model was fine-tuned for 12-lingual Hungarian-centric machine translation. Last but not least, a Marian NMT-based machine translation system was also built from scratch for the same 12-lingual Hungarian-centric machine translation task. Using the cross-lingual transfer method, we achieved higher performance on all of our tasks. In our machine translation experiments, our fine-tuned M2M100 model outperformed Google Translate, Microsoft Translator, and eTranslation.