
Sentence Simplification With Memory-augmented Neural Networks

Tu Vu, Baotian Hu, Tsendsuren Munkhdalai, Hong Yu. Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), 2018 – 47 citations

[Paper]
ACL Applications Datasets Evaluation Interdisciplinary Approaches Model Architecture NAACL Neural Machine Translation Variational Autoencoders

Sentence simplification aims to simplify the content and structure of complex sentences, and thus make them easier to interpret for human readers, and easier to process for downstream NLP applications. Recent advances in neural machine translation have paved the way for novel approaches to the task. In this paper, we adapt an architecture with augmented memory capacities called Neural Semantic Encoders (Munkhdalai and Yu, 2017) for sentence simplification. Our experiments demonstrate the effectiveness of our approach on different simplification datasets, both in terms of automatic evaluation measures and human judgments.
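The approach treats simplification as monolingual translation and swaps the standard encoder for a Neural Semantic Encoder, which maintains a token-slot memory that is read from and written to at every time step. The sketch below is a minimal, hypothetical PyTorch rendering of that read-compose-write cycle, assuming single-layer LSTM cells and a linear composition layer; it illustrates the memory mechanism rather than reproducing the authors' implementation.

```python
# Illustrative NSE-style encoder step (not the authors' code). Dimensions and
# module choices are assumptions made for clarity.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NSEEncoder(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.read_lstm = nn.LSTMCell(dim, dim)    # produces the read key
        self.compose = nn.Linear(2 * dim, dim)    # merges key with retrieved slot
        self.write_lstm = nn.LSTMCell(dim, dim)   # produces the vector written back

    def forward(self, embeddings: torch.Tensor):
        # embeddings: (seq_len, dim) word embeddings of one sentence
        seq_len, dim = embeddings.shape
        memory = embeddings.clone()               # memory initialised with the input tokens
        read_state = (torch.zeros(1, dim), torch.zeros(1, dim))
        write_state = (torch.zeros(1, dim), torch.zeros(1, dim))
        outputs = []
        for t in range(seq_len):
            x_t = embeddings[t].unsqueeze(0)                  # (1, dim)
            # Read: attend over the evolving memory with the current key.
            read_state = self.read_lstm(x_t, read_state)
            key = read_state[0]                               # (1, dim)
            attn = F.softmax(memory @ key.squeeze(0), dim=0)  # (seq_len,)
            retrieved = attn @ memory                         # (dim,)
            # Compose: combine the key with what was retrieved.
            composed = torch.tanh(
                self.compose(torch.cat([key.squeeze(0), retrieved]))
            ).unsqueeze(0)                                    # (1, dim)
            # Write: update the hidden state and overwrite attended slots.
            write_state = self.write_lstm(composed, write_state)
            h_t = write_state[0].squeeze(0)                   # (dim,)
            memory = memory * (1 - attn).unsqueeze(1) + attn.unsqueeze(1) * h_t
            outputs.append(h_t)
        return torch.stack(outputs), memory


# Example: encode a 6-token sentence with 64-dimensional embeddings.
enc = NSEEncoder(dim=64)
hidden_states, final_memory = enc(torch.randn(6, 64))
```

In an encoder-decoder setup such as the one the paper describes, these per-step hidden states (and the final memory) would feed an attention-based decoder trained on complex-simple sentence pairs.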

Similar Work