
Context Models For OOV Word Translation In Low-resource Languages

Angli Liu, Katrin Kirchhoff. Proceedings of the 13th Conference of the Association for Machine Translation in the Americas (AMTA 2018) – 82 citations

Tags: Compositional Generalization, Interdisciplinary Approaches, Multimodal Semantic Representation, Neural Machine Translation, Training Techniques

Out-of-vocabulary (OOV) word translation is a major problem for the translation of low-resource languages that suffer from a lack of parallel training data. This paper evaluates the contributions of target-language context models to the translation of OOV words, specifically in cases where OOV translations are derived from external knowledge sources such as dictionaries. We develop both neural and non-neural context models and evaluate them within both phrase-based and self-attention based neural machine translation systems. Our results show that neural language models that integrate additional context beyond the current sentence are the most effective at disambiguating candidate OOV word translations. We present an efficient second-pass lattice-rescoring method for wide-context neural language models and demonstrate performance improvements over state-of-the-art self-attention based neural MT systems on five out of six low-resource language pairs.
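The second-pass idea in the abstract can be sketched as follows: the first pass produces hypotheses whose OOV slot is filled with dictionary-derived candidate translations, and a second pass rescores them with a target-language context model that looks beyond the current sentence. The snippet below is only an illustrative sketch, not the paper's method: the toy co-occurrence table `CONTEXT_COUNTS`, the `rescore` interface, and all scores are invented stand-ins for the wide-context neural language models and lattices the paper actually uses.

```python
import math

# Hypothetical toy context model: a bag-of-words co-occurrence table standing
# in for the paper's wide-context neural LM. Counts are invented for the demo.
CONTEXT_COUNTS = {
    ("bank", "river"): 2, ("bank", "water"): 3,
    ("bank", "money"): 9, ("bank", "loan"): 7,
    ("shore", "river"): 8, ("shore", "water"): 6,
    ("shore", "money"): 1,
}

def context_score(candidate, context_words):
    """Add-one-smoothed log co-occurrence score of a candidate OOV
    translation against surrounding target-language context words."""
    return sum(math.log(CONTEXT_COUNTS.get((candidate, w), 0) + 1)
               for w in context_words)

def rescore(hypotheses, context_words, lm_weight=1.0):
    """Second-pass rescoring: pick the hypothesis maximizing the
    first-pass MT score plus the weighted context-model score."""
    return max(hypotheses,
               key=lambda h: h["mt_score"]
               + lm_weight * context_score(h["oov_candidate"], context_words))

# Two dictionary-derived translations competing for the same OOV slot.
hyps = [
    {"text": "he sat by the bank",  "oov_candidate": "bank",  "mt_score": -1.0},
    {"text": "he sat by the shore", "oov_candidate": "shore", "mt_score": -1.2},
]

# Context drawn from neighboring sentences flips the choice.
print(rescore(hyps, ["river", "water"])["oov_candidate"])  # "shore"
print(rescore(hyps, ["money", "loan"])["oov_candidate"])   # "bank"
```

The design point the sketch mirrors is that rescoring is decoupled from decoding: the context model never needs to run inside the (phrase-based or self-attention) decoder, so arbitrarily wide context can be consulted cheaply in the second pass.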
