Plan-then-generate: Controlled Data-to-text Generation Via Planning | Awesome LLM Papers

Plan-then-generate: Controlled Data-to-text Generation Via Planning

Yixuan Su, David Vandyke, Sihui Wang, Yimai Fang, Nigel Collier. Findings of the Association for Computational Linguistics: EMNLP 2021 – 54 citations

[Paper]

Recent developments in neural networks have advanced data-to-text generation. However, neural models' limited ability to control the structure of the generated output can be restrictive in real-world applications. In this study, we propose a novel Plan-then-Generate (PlanGen) framework to improve the controllability of neural data-to-text models. Extensive experiments and analyses are conducted on two benchmark datasets, ToTTo and WebNLG. The results show that our model can control both the intra-sentence and inter-sentence structure of the generated output. Furthermore, empirical comparisons against previous state-of-the-art methods show that our model improves both generation quality and output diversity, as judged by human and automatic evaluations.
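The two-stage idea behind such a framework can be illustrated with a minimal sketch. This is a hypothetical toy pipeline, not the paper's actual implementation: the function names, record format, and template-based realiser are all illustrative assumptions. The point it shows is the interface: a planner first fixes an explicit ordering over the input records (the content plan), and the generator then conditions on that plan, so reordering the plan reorders the output text.

```python
# Hypothetical plan-then-generate sketch (illustrative names, not the
# paper's API). Input is a list of (key, value) records, e.g. table cells.

def make_content_plan(records, key_order):
    """Stage 1 (planner): arrange record keys into an explicit content plan.
    In the actual framework this ordering is predicted by a learned model."""
    by_key = {k: v for k, v in records}
    return [(k, by_key[k]) for k in key_order if k in by_key]

def realise(plan):
    """Stage 2 (generator): verbalise the records in the planned order.
    A real system would use a neural seq2seq generator conditioned on
    the plan; a fixed template stands in for it here."""
    return "; ".join(f"{k} is {v}" for k, v in plan)

records = [("team", "Warriors"), ("score", "110-98"), ("venue", "Oracle Arena")]
plan = make_content_plan(records, ["venue", "team", "score"])
print(realise(plan))
# -> venue is Oracle Arena; team is Warriors; score is 110-98
```

Because the plan is an explicit intermediate object, changing it (e.g. swapping "venue" and "score") directly controls the sentence structure, which is the controllability property the abstract describes.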

Similar Work