Modeling Past And Future For Neural Machine Translation | Awesome LLM Papers

Modeling Past And Future For Neural Machine Translation

Zaixiang Zheng, Hao Zhou, Shujian Huang, Lili Mou, Xinyu Dai, Jiajun Chen, Zhaopeng Tu. Transactions of the Association for Computational Linguistics 2018 – 52 citations

[Paper]   Search on Google Scholar   Search on Semantic Scholar
ACL Interdisciplinary Approaches Neural Machine Translation TACL

Existing neural machine translation systems do not explicitly model what has been translated and what has not during the decoding phase. To address this problem, we propose a novel mechanism that separates the source information into two parts: translated Past contents and untranslated Future contents, which are modeled by two additional recurrent layers. The Past and Future contents are fed to both the attention model and the decoder states, providing the NMT system with knowledge of the translated and untranslated contents. Experimental results show that the proposed approach significantly improves translation performance on Chinese-English, German-English, and English-German translation tasks. Specifically, the proposed model outperforms the conventional coverage model in both translation quality and alignment error rate.
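The abstract describes a decoder that maintains two extra recurrent states: a Past state that accumulates the attention context (what has been translated so far) and a Future state that starts from a summary of the source and is updated as content is consumed. Below is a minimal PyTorch sketch of one such decoder step, written under assumptions: the class name `PastFutureDecoderStep`, the use of GRU cells, and the exact update and attention formulations are illustrative, not the authors' implementation.

```python
# Illustrative sketch of a Past/Future decoder step; names, cell types, and
# update rules are assumptions, not the paper's reference code.
import torch
import torch.nn as nn

class PastFutureDecoderStep(nn.Module):
    def __init__(self, hidden_size, ctx_size, emb_size):
        super().__init__()
        # Two additional recurrent layers: one for translated (Past) content,
        # one for untranslated (Future) content.
        self.past_rnn = nn.GRUCell(ctx_size, hidden_size)
        self.future_rnn = nn.GRUCell(ctx_size, hidden_size)
        # Main decoder state update also conditions on the Past/Future states.
        self.decoder_rnn = nn.GRUCell(emb_size + ctx_size + 2 * hidden_size, hidden_size)
        # Attention scores depend on the decoder, Past, and Future states.
        self.attn = nn.Linear(3 * hidden_size + ctx_size, 1)

    def forward(self, y_emb, s_prev, past_prev, future_prev, enc_states):
        # enc_states: (batch, src_len, ctx_size) encoder annotations
        batch, src_len, _ = enc_states.shape
        query = torch.cat([s_prev, past_prev, future_prev], dim=-1)     # (batch, 3H)
        query = query.unsqueeze(1).expand(-1, src_len, -1)
        scores = self.attn(torch.cat([query, enc_states], dim=-1)).squeeze(-1)
        alpha = torch.softmax(scores, dim=-1)                           # (batch, src_len)
        c_t = torch.bmm(alpha.unsqueeze(1), enc_states).squeeze(1)      # (batch, ctx_size)

        # Past layer accumulates the newly translated content (context c_t);
        # Future layer updates the untranslated summary with the same signal.
        past_t = self.past_rnn(c_t, past_prev)
        future_t = self.future_rnn(c_t, future_prev)

        # Decoder state sees the target embedding, context, and both extra states.
        s_t = self.decoder_rnn(torch.cat([y_emb, c_t, past_t, future_t], dim=-1), s_prev)
        return s_t, past_t, future_t, alpha
```

In this sketch the Past state would typically be initialized to zeros and the Future state to an encoder-side summary of the source sentence, so that the two states track translated and untranslated content across decoding steps.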

Similar Work