Deterministic Non-autoregressive Neural Sequence Modeling By Iterative Refinement

Jason Lee, Elman Mansimov, Kyunghyun Cho. Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP 2018) – 403 citations


We propose a conditional non-autoregressive neural sequence model based on iterative refinement. The proposed model is designed based on the principles of latent variable models and denoising autoencoders, and is generally applicable to any sequence generation task. We extensively evaluate the proposed model on machine translation (En-De and En-Ro) and image caption generation, and observe that it significantly speeds up decoding while maintaining generation quality comparable to that of its autoregressive counterpart.
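To illustrate the decoding procedure the abstract describes, the sketch below shows one way iterative-refinement decoding can look in practice: the full target sequence is predicted in parallel, then repeatedly re-fed to the decoder as a denoising step until it stops changing or an iteration budget is exhausted. This is a minimal illustration, not the authors' implementation; `encoder`, `decoder`, `length_predictor`, the placeholder token id, and the fixed-point stopping rule are all assumptions made for the example.

```python
# Minimal sketch of non-autoregressive decoding by iterative refinement.
# The model interfaces (encoder, decoder, length_predictor) are hypothetical
# callables standing in for the paper's components, not its actual API.
import torch


@torch.no_grad()
def iterative_refinement_decode(encoder, decoder, length_predictor,
                                src_tokens, max_iters=10, placeholder_id=3):
    """Decode a full target sequence in parallel, then refine it iteratively.

    Assumed interfaces:
        encoder(src_tokens)              -> src_hidden   [B, S, H]
        length_predictor(src_hidden)     -> lengths      [B]
        decoder(prev_tokens, src_hidden) -> logits       [B, T, V]
    """
    src_hidden = encoder(src_tokens)

    # Non-autoregressive models predict the target length up front.
    tgt_len = int(length_predictor(src_hidden).max().item())

    # Initial hypothesis: every position filled with a placeholder token.
    hyp = torch.full((src_tokens.size(0), tgt_len), placeholder_id, dtype=torch.long)

    for _ in range(max_iters):
        logits = decoder(hyp, src_hidden)   # refine all positions in parallel
        new_hyp = logits.argmax(dim=-1)     # deterministic (greedy) update
        if torch.equal(new_hyp, hyp):       # stop at a fixed point of the refinement step
            break
        hyp = new_hyp
    return hyp
```

Because each refinement step updates every position at once, the number of decoder passes is bounded by `max_iters` rather than by the target length, which is the source of the decoding speedup over autoregressive generation.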

Similar Work