Finding The Optimal Vocabulary Size For Neural Machine Translation

Thamme Gowda, Jonathan May. Findings of the Association for Computational Linguistics: EMNLP 2020 – 45 citations

Tags: ACL · EMNLP · Interdisciplinary Approaches · Neural Machine Translation · Training Techniques

We cast neural machine translation (NMT) as a classification task in an autoregressive setting and analyze the limitations of both the classification and autoregression components. Classifiers are known to perform better with balanced class distributions during training. Since the Zipfian nature of languages causes imbalanced classes, we explore its effect on NMT. We analyze the effect of various vocabulary sizes on NMT performance on multiple languages across a range of data sizes, and reveal an explanation for why certain vocabulary sizes are better than others.
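The abstract's core argument is that Zipfian token frequencies make the target-side classification problem increasingly imbalanced as the vocabulary grows. Below is a minimal Python sketch of that intuition: the simulated Zipfian counts and the imbalance measure `D` (half the L1 distance from the uniform distribution) are illustrative assumptions for this sketch, not the paper's exact definitions.

```python
# Sketch: why a Zipfian token distribution yields imbalanced classes,
# and how a simple imbalance measure varies with vocabulary size.
# Both the Zipf simulation and the measure D are illustrative choices.
from collections import Counter


def zipf_counts(vocab_size: int, corpus_tokens: int = 1_000_000) -> Counter:
    """Simulate token counts under a Zipfian rank-frequency law."""
    harmonic = sum(1.0 / r for r in range(1, vocab_size + 1))
    return Counter(
        {r: int(corpus_tokens / (r * harmonic)) for r in range(1, vocab_size + 1)}
    )


def imbalance(counts: Counter) -> float:
    """D = 0.5 * sum_i |p_i - 1/K|: 0 for uniform classes, grows toward 1 with skew."""
    total = sum(counts.values())
    k = len(counts)
    return 0.5 * sum(abs(c / total - 1.0 / k) for c in counts.values())


for vocab in (500, 8_000, 32_000, 64_000):
    d = imbalance(zipf_counts(vocab))
    print(f"vocab={vocab:>6}  imbalance D={d:.3f}")
```

Running this shows D climbing toward 1 as the vocabulary grows: larger vocabularies add ever-rarer classes, which is the kind of imbalance the paper investigates when explaining why some vocabulary sizes work better than others.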
