
Modeling Coverage For Neural Machine Translation

Zhaopeng Tu, Zhengdong Lu, Yang Liu, Xiaohua Liu, Hang Li. Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2016 – 679 citations

ACL Compositional Generalization Interdisciplinary Approaches Model Architecture Neural Machine Translation

The attention mechanism has advanced state-of-the-art Neural Machine Translation (NMT) by jointly learning to align and translate. However, it tends to ignore past alignment information, which often leads to over-translation and under-translation. To address this problem, we propose coverage-based NMT. We maintain a coverage vector to keep track of the attention history; this vector is fed to the attention model to help adjust future attention, encouraging the NMT system to pay more attention to untranslated source words. Experiments show that the proposed approach significantly improves both translation quality and alignment quality over standard attention-based NMT.
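To make the idea concrete, here is a minimal sketch of one decoder step of coverage-augmented additive (Bahdanau-style) attention. It is illustrative only, not the paper's exact parameterization: the parameter names (`W_h`, `W_s`, `W_c`, `v`) and the plain-NumPy setting are assumptions, and the coverage update shown is the simple running sum of past attention weights rather than the paper's learned neural/linguistic variants.

```python
import numpy as np

def coverage_attention(h_src, s_prev, coverage, W_h, W_s, W_c, v):
    """One decoder step of coverage-augmented additive attention (sketch).

    h_src:    (T, d) encoder hidden states for T source words
    s_prev:   (d,)   previous decoder hidden state
    coverage: (T,)   accumulated attention each source word has received so far
    W_h, W_s: (d, d) projection matrices; W_c, v: (d,) vectors (all assumed names)
    """
    # Score each source position. The coverage term lets the model
    # down-weight source words that were already attended to (avoiding
    # over-translation) and favor untranslated ones (avoiding under-translation).
    e = np.tanh(h_src @ W_h + s_prev @ W_s + coverage[:, None] * W_c) @ v

    # Softmax over source positions (stabilized by subtracting the max).
    a = np.exp(e - e.max())
    a /= a.sum()

    coverage = coverage + a   # update the attention history
    context = a @ h_src       # context vector for this decoder step
    return context, a, coverage
```

At each decoding step the returned `coverage` is fed back in, so the attention model always sees how much of the source has already been covered; without that feedback (i.e. with `coverage` held at zero) this reduces to standard attention-based NMT.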
