
Zero-resource Translation With Multi-lingual Neural Machine Translation

Orhan Firat, Baskaran Sankaran, Yaser Al-Onaizan, Fatos T. Yarman Vural, Kyunghyun Cho. Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP 2016) – 274 citations

In this paper, we propose a novel finetuning algorithm for the recently introduced multi-way, multilingual neural machine translation model that enables zero-resource machine translation. When used together with novel many-to-one translation strategies, we empirically show that this finetuning algorithm allows the multi-way, multilingual model to translate a zero-resource language pair (1) as well as a single-pair neural translation model trained with up to 1M direct parallel sentences of the same language pair and (2) better than a pivot-based translation strategy, while keeping only one additional copy of attention-related parameters.
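To make the two strategies in the abstract concrete, here is a minimal sketch contrasting pivot-based translation with finetuning on a pseudo-parallel corpus for a zero-resource pair (Spanish→French via English as an example). The `MultiWayModel` class, its methods, and the language codes are illustrative assumptions, not the authors' actual implementation.

```python
# Hedged sketch of the two zero-resource strategies from the abstract.
# MultiWayModel is a toy stand-in, not the paper's code.
from typing import List, Tuple


class MultiWayModel:
    """Toy stand-in for a multi-way, multilingual NMT model with one
    shared attention mechanism across all language pairs."""

    def translate(self, sents: List[str], src: str, tgt: str) -> List[str]:
        # A real model would run: encoder(src) -> shared attention -> decoder(tgt).
        return [f"<{src}->{tgt}: {s}>" for s in sents]

    def finetune(self, pairs: List[Tuple[str, str]], src: str, tgt: str) -> None:
        # Per the abstract, finetuning adds and updates only one extra
        # copy of the attention-related parameters.
        print(f"finetuning {src}->{tgt} attention on {len(pairs)} pseudo pairs")


def pivot_translate(model: MultiWayModel, sents: List[str],
                    src: str, pivot: str, tgt: str) -> List[str]:
    """Baseline strategy: source -> pivot -> target, two decoding passes."""
    return model.translate(model.translate(sents, src, pivot), pivot, tgt)


def pseudo_parallel(model: MultiWayModel, src_pivot: List[Tuple[str, str]],
                    pivot: str, tgt: str) -> List[Tuple[str, str]]:
    """Build a pseudo-parallel src->tgt corpus by translating the pivot side
    of an existing src-pivot corpus into the target language."""
    tgt_side = model.translate([p for _, p in src_pivot], pivot, tgt)
    return list(zip([s for s, _ in src_pivot], tgt_side))


if __name__ == "__main__":
    model = MultiWayModel()
    es_en = [("una frase", "a sentence")]  # existing Es-En parallel data
    model.finetune(pseudo_parallel(model, es_en, "en", "fr"), "es", "fr")
    print(pivot_translate(model, ["otra frase"], "es", "en", "fr"))
    # After finetuning, the model translates es->fr directly, in one pass:
    print(model.translate(["otra frase"], "es", "fr"))
```

The design point the sketch illustrates: the pivot baseline requires two decoding passes and compounds errors, while the finetuned model decodes the zero-resource pair directly, at the cost of one extra copy of attention parameters.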

Similar Work