
Neural Text Generation: Past, Present And Beyond

Sidi Lu, Yaoming Zhu, Weinan Zhang, Jun Wang, Yong Yu. arXiv preprint, 2018 – 458 citations

[Paper]   Search on Google Scholar   Search on Semantic Scholar
Datasets Reinforcement Learning Survey Paper Training Techniques Variational Autoencoders

This paper presents a systematic survey of recent developments in neural text generation models. Specifically, we start from recurrent neural network language models with the traditional maximum likelihood estimation (MLE) training scheme and point out its shortcomings for text generation. We then introduce recently proposed methods for text generation based on reinforcement learning, re-parametrization tricks, and generative adversarial network (GAN) techniques. We compare the properties of these models and the corresponding techniques for handling their common problems, such as vanishing gradients and limited generation diversity. Finally, we conduct a benchmarking experiment with different types of neural text generation models on two well-known datasets and discuss the empirical results in light of the aforementioned model properties.
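The abstract names three families of training signals for a neural text generator. The following is a minimal, hypothetical PyTorch sketch of each for a toy RNN language model; it is illustrative code, not code from the paper, and the model, dimensions, and reward are placeholder assumptions:

```python
# Illustrative sketch (not from the paper): the three training signals the
# survey contrasts, shown for a toy RNN language model in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, embed_dim, hidden_dim = 100, 32, 64

class RNNLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))
        return self.out(h)  # next-token logits at every position

model = RNNLM()
tokens = torch.randint(0, vocab_size, (8, 20))  # toy batch of sequences

# 1) Maximum likelihood estimation (teacher forcing): predict token t+1
#    from the ground-truth prefix up to t. This is the classical scheme
#    the survey starts from.
logits = model(tokens[:, :-1])                     # (batch, T-1, vocab)
mle_loss = F.cross_entropy(logits.reshape(-1, vocab_size),
                           tokens[:, 1:].reshape(-1))

# 2) Gumbel-Softmax re-parametrization: a differentiable relaxation of
#    discrete sampling, so gradients can flow through generated tokens
#    (the "re-parametrization trick" family of methods).
soft_sample = F.gumbel_softmax(logits, tau=1.0, hard=False)

# 3) REINFORCE-style policy gradient: treat the LM as a policy, sample
#    tokens, and weight their log-probabilities by a sequence reward
#    (e.g. from a discriminator, as in GAN-based methods). The reward
#    here is a random placeholder.
dist = torch.distributions.Categorical(logits=logits)
sampled = dist.sample()                            # (batch, T-1)
reward = torch.rand(8, 1)                          # hypothetical reward
pg_loss = -(reward * dist.log_prob(sampled)).mean()

print(mle_loss.item(), soft_sample.shape, pg_loss.item())
```

The contrast between these signals is the survey's central theme: MLE trains against ground-truth prefixes, while the re-parametrization and policy-gradient variants let a learning signal flow through the model's own discrete samples.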

Similar Work