Incremental Transformer With Deliberation Decoder For Document Grounded Conversations | Awesome LLM Papers

Incremental Transformer With Deliberation Decoder For Document Grounded Conversations

Zekang Li, Cheng Niu, Fandong Meng, Yang Feng, Qian Li, Jie Zhou. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019 – 105 citations

[Paper]
ACL · Datasets · Interdisciplinary Approaches · Model Architecture

Document Grounded Conversations is the task of generating dialogue responses while chatting about the content of a given document. Document knowledge plays a critical role in this task, yet existing dialogue models do not exploit it effectively. In this paper, we propose a novel Transformer-based architecture for multi-turn document grounded conversations. In particular, we devise an Incremental Transformer to encode multi-turn utterances along with the knowledge in related documents. Motivated by the human cognitive process, we design a two-pass decoder (Deliberation Decoder) to improve context coherence and knowledge correctness. An empirical study on a real-world document grounded dataset shows that responses generated by our model significantly outperform competitive baselines in both context coherence and knowledge relevance.
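The two-stage flow described in the abstract — incrementally encoding each utterance together with document knowledge, then decoding in two passes — can be sketched in plain Python. This is an illustrative toy, not the authors' implementation: the function names are hypothetical, and trivial string operations stand in for the Transformer encoder and decoder stacks.

```python
# Toy sketch (assumed structure, not the paper's code) of the
# Incremental Transformer + Deliberation Decoder pipeline.

def incremental_encode(utterances, document):
    """Fold each utterance into a running dialogue state, the way the
    Incremental Transformer incrementally attends over utterances and
    the knowledge in related documents."""
    context = []
    for utterance in utterances:
        context.append(utterance)  # stand-in for attention over the utterance
    return {"context": context, "doc": document}

def first_pass(state):
    """Pass 1: draft a reply conditioned mainly on dialogue context
    (targets context coherence)."""
    return "Reply to: " + state["context"][-1]

def second_pass(draft, state):
    """Pass 2: rewrite the draft against the grounding document
    (targets knowledge correctness)."""
    return draft + " [grounded in: " + state["doc"] + "]"

def respond(utterances, document):
    state = incremental_encode(utterances, document)
    return second_pass(first_pass(state), state)

print(respond(["hi", "what is the film about?"], "a sci-fi film"))
```

The point of the two passes is separation of concerns: the first pass only has to sound coherent in context, and the second pass only has to reconcile that draft with the document, mirroring how a person drafts and then revises an answer.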

Similar Work