
Speaker-aware BERT For Multi-turn Response Selection In Retrieval-based Chatbots

Jia-chen Gu et al. CIKM '20: The 29th ACM International Conference on Information and Knowledge Management, 2020 – 54 citations

[Paper]

Tags: BERT · Fine-Tuning · Model Architecture · Evaluation

In this paper, we study the problem of employing pre-trained language models for multi-turn response selection in retrieval-based chatbots. A new model, named Speaker-Aware BERT (SA-BERT), is proposed to make the model aware of speaker-change information, an important and intrinsic property of multi-turn dialogues. Furthermore, a speaker-aware disentanglement strategy is proposed to handle entangled dialogues: it selects a small number of the most important utterances as the filtered context, based on the speaker information they contain. Finally, domain adaptation is performed to incorporate in-domain knowledge into the pre-trained language model. Experiments on five public datasets show that our proposed model outperforms existing models on all metrics by large margins and achieves new state-of-the-art performance for multi-turn response selection.
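The abstract describes three components: speaker embeddings added to BERT's input representation, speaker-aware context filtering, and in-domain adaptation. The sketch below illustrates only the first of these in PyTorch/HuggingFace terms: a learned per-speaker embedding is summed into BERT's token embeddings so the encoder can track speaker changes across turns. The class name `SpeakerAwareBert`, the two-speaker default, the token-level `speaker_ids` input, and the pairwise scoring head are all illustrative assumptions, not the authors' released implementation.

```python
import torch
from transformers import BertModel

class SpeakerAwareBert(torch.nn.Module):
    """Minimal sketch of speaker-aware response selection with BERT.

    Adds a learned speaker embedding to each token's word embedding
    before running the standard BERT encoder, then scores a
    (context, response) pair from the [CLS] representation.
    """

    def __init__(self, model_name="bert-base-uncased", num_speakers=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # One embedding vector per speaker, same width as BERT's hidden size.
        self.speaker_embeddings = torch.nn.Embedding(num_speakers, hidden)
        self.classifier = torch.nn.Linear(hidden, 1)

    def forward(self, input_ids, attention_mask, speaker_ids):
        # Look up word embeddings manually so we can add speaker embeddings;
        # position and segment embeddings are still added inside BERT when
        # we pass `inputs_embeds`.
        token_embeds = self.bert.embeddings.word_embeddings(input_ids)
        inputs_embeds = token_embeds + self.speaker_embeddings(speaker_ids)
        outputs = self.bert(inputs_embeds=inputs_embeds,
                            attention_mask=attention_mask)
        # Matching score for the (context, response) pair from [CLS].
        return self.classifier(outputs.last_hidden_state[:, 0])
```

In use, `speaker_ids` would be built alongside tokenization, assigning each token the id of the speaker of its utterance (e.g. alternating 0/1 in a two-party dialogue), so that speaker changes show up directly in the input representation. The paper's disentanglement strategy and domain-adaptation step are not shown here.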

Similar Work