Cutting-off Redundant Repeating Generations For Neural Abstractive Summarization | Awesome LLM Papers

Cutting-off Redundant Repeating Generations For Neural Abstractive Summarization

Jun Suzuki, Masaaki Nagata. Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers, 2017. 61 citations.

[Paper]
Tags: ACL, Evaluation, Interdisciplinary Approaches

This paper tackles the reduction of redundant repeating generation that is often observed in RNN-based encoder-decoder models. The basic idea is to jointly estimate, in the encoder, the upper-bound frequency of each target vocabulary word, and to control the output words in the decoder based on these estimates. The method shows significant improvement over a strong RNN-based encoder-decoder baseline and achieved its best results on an abstractive summarization benchmark.
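The decoder-side control described above can be illustrated with a toy sketch. This is not the paper's implementation (the paper estimates the frequency bounds with the encoder and integrates them into the model); here we simply assume the per-word upper-bound frequencies `max_freq` are already given, and show a greedy decoder that suppresses any word whose emitted count has reached its estimated bound. The function name and the `step_scores` interface are hypothetical.

```python
# Toy sketch of frequency-constrained greedy decoding (assumed interface,
# not the paper's actual model). `step_scores(prev, counts)` stands in for
# the decoder's next-word scoring function; `max_freq` stands in for the
# encoder-estimated upper-bound frequency of each target word.
import math

def greedy_decode_with_caps(step_scores, max_freq, eos="</s>", max_len=10):
    output = []          # generated words so far
    counts = {}          # how many times each word has been emitted
    prev = "<s>"
    for _ in range(max_len):
        scores = step_scores(prev, counts)
        # Mask any word that has already hit its estimated upper bound;
        # words without an estimate are left unconstrained.
        allowed = {w: s for w, s in scores.items()
                   if counts.get(w, 0) < max_freq.get(w, math.inf)}
        if not allowed:
            break
        prev = max(allowed, key=allowed.get)  # greedy pick among allowed words
        if prev == eos:
            break
        output.append(prev)
        counts[prev] = counts.get(prev, 0) + 1
    return output
```

For example, with a degenerate scorer that always ranks "the" highest, capping "the" at one occurrence forces the decoder to move on instead of repeating it, which is the intuition behind cutting off redundant repeating generations.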

Similar Work