
Neural Machine Translation With Monolingual Translation Memory

Deng Cai, Yan Wang, Huayang Li, Wai Lam, Lemao Liu. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2021 – 49 citations

[Paper]
Tags: ACL, Fine Tuning, Interdisciplinary Approaches, Neural Machine Translation, Retrieval Systems, Tools

Prior work has shown that translation memory (TM) can boost the performance of neural machine translation (NMT). In contrast to existing work, which uses a bilingual corpus as the TM and employs source-side similarity search for memory retrieval, we propose a new framework that uses monolingual memory and performs learnable memory retrieval in a cross-lingual manner. Our framework has two unique advantages. First, the cross-lingual memory retriever allows abundant monolingual data to serve as TM. Second, the memory retriever and the NMT model can be jointly optimized for the ultimate translation goal. Experiments show that the proposed method obtains substantial improvements; remarkably, it even outperforms strong TM-augmented NMT baselines that use bilingual TM. Owing to its ability to leverage monolingual data, our model also proves effective in low-resource and domain adaptation scenarios.
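To make the two advantages concrete, here is a minimal PyTorch sketch of the core idea: a dual-encoder cross-lingual retriever whose relevance score is differentiable, so gradients from the translation loss can update the retriever. All names (`CrossLingualRetriever`, `memory_weighted_loss`) are illustrative, and the toy objective marginalizes over retrieved memories for simplicity; the actual paper instead feeds the relevance scores into the decoder's attention over memory tokens.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossLingualRetriever(nn.Module):
    """Dual-encoder retriever (sketch): a source-language query encoder and a
    target-language memory encoder map sentences into a shared space, so a
    source sentence can retrieve monolingual target-side TM directly."""
    def __init__(self, src_encoder: nn.Module, tgt_encoder: nn.Module):
        super().__init__()
        self.src_encoder = src_encoder  # embeds source-language queries
        self.tgt_encoder = tgt_encoder  # embeds monolingual target-side TM

    def score(self, src_vec: torch.Tensor, mem_vecs: torch.Tensor) -> torch.Tensor:
        # Relevance f(x, m) is a dense dot product: differentiable, which is
        # what makes the retrieval "learnable" rather than fixed BM25-style.
        return mem_vecs @ src_vec  # [k]

def memory_weighted_loss(nmt_logprobs: torch.Tensor,
                         relevance: torch.Tensor,
                         gold_ids: torch.Tensor) -> torch.Tensor:
    """Toy joint objective (an assumption, not the paper's exact loss):
    translation probability is marginalized over the k retrieved memories,
    weighted by softmax-normalized relevance. Gradients flow through
    `relevance` back into the retriever encoders.
      nmt_logprobs: [k, T, V] decoder log-probs conditioned on each memory
      relevance:    [k]       retriever scores for the k memories
      gold_ids:     [T]       reference target token ids
    """
    log_weights = F.log_softmax(relevance, dim=0)  # log p(m | x)
    tok = nmt_logprobs.gather(
        -1, gold_ids.expand(nmt_logprobs.size(0), -1).unsqueeze(-1)
    ).squeeze(-1)                                  # [k, T] gold-token log-probs
    per_mem = tok.sum(-1) + log_weights            # log p(y | x, m) + log p(m | x)
    return -torch.logsumexp(per_mem, dim=0)        # -log sum_m p(y, m | x)
```

Because both the memory encoder and the translation loss are differentiable end to end, optimizing this single objective trains the retriever toward memories that actually help translation, rather than toward surface similarity.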

Similar Work