
Sequence To Backward And Forward Sequences: A Content-introducing Approach To Generative Short-text Conversation

Lili Mou, Yiping Song, Rui Yan, Ge Li, Lu Zhang, Zhi Jin. arXiv 2016 – 200 citations

[Paper]
Uncategorized

Using neural networks to generate replies in human-computer dialogue systems has attracted increasing attention over the past few years. However, performance remains unsatisfactory: the neural network tends to generate safe, universally relevant replies that carry little meaning. In this paper, we propose a content-introducing approach to neural network-based generative dialogue systems. We first use pointwise mutual information (PMI) to predict a noun as a keyword, reflecting the main gist of the reply. We then propose seq2BF, a “sequence to backward and forward sequences” model, which generates a reply containing the given keyword. Experimental results show that our approach significantly outperforms traditional sequence-to-sequence models in terms of human evaluation and the entropy measure, and that the predicted keyword can appear at an appropriate position in the reply.
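The abstract describes a two-step pipeline: pick a reply keyword by PMI against the query words, then generate the reply outward from that keyword (backward to the sentence start, then forward to the end). The following is a minimal sketch of that idea, not the authors' code: the corpus statistics, the `decode_backward`/`decode_forward` stubs, and all names are illustrative assumptions.

```python
import math
from collections import Counter

def build_cooccurrence(pairs):
    """Count word frequencies and query-reply co-occurrences from (query, reply) string pairs."""
    q_counts, r_counts, joint = Counter(), Counter(), Counter()
    for query, reply in pairs:
        q_words, r_words = set(query.split()), set(reply.split())
        q_counts.update(q_words)
        r_counts.update(r_words)
        joint.update((q, r) for q in q_words for r in r_words)
    return q_counts, r_counts, joint

def predict_keyword(query, candidate_nouns, q_counts, r_counts, joint, n_pairs):
    """Pick the candidate noun with the highest total PMI against the query words,
    where PMI(q, r) = log p(q, r) / (p(q) p(r)) estimated from pair counts."""
    def pmi(q, r):
        p_joint = joint[(q, r)] / n_pairs
        if p_joint == 0:
            return float("-inf")
        return math.log(p_joint / ((q_counts[q] / n_pairs) * (r_counts[r] / n_pairs)))
    q_words = query.split()
    return max(candidate_nouns, key=lambda r: sum(pmi(q, r) for q in q_words))

def seq2bf_reply(query, keyword, decode_backward, decode_forward):
    """Assemble a reply around the keyword: the backward decoder emits the words
    before the keyword in reverse order; the forward decoder continues after it."""
    backward_part = decode_backward(query, keyword)        # e.g. ["really", "I"] for "I really"
    first_half = list(reversed(backward_part)) + [keyword]
    second_half = decode_forward(query, first_half)        # words following the keyword
    return " ".join(first_half + second_half)
```

In this sketch the two decoders stand in for the paper's backward and forward sequence generators; any seq2seq-style decoder conditioned on the query (and, for the forward pass, on the already-fixed first half) could be plugged in to experiment with the keyword-centered generation order.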

Similar Work