
Case-based Reasoning For Natural Language Queries Over Knowledge Bases

Rajarshi Das, Manzil Zaheer, Dung Thai, Ameya Godbole, Ethan Perez, Jay-Yoon Lee, Lizhen Tan, Lazaros Polymenakos, Andrew McCallum. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021) – 99 citations

[Paper]
Tags: Datasets · Question Answering

It is often challenging to solve a complex problem from scratch, but much easier if we can access other similar problems together with their solutions – a paradigm known as case-based reasoning (CBR). We propose a neuro-symbolic CBR approach (CBR-KBQA) for question answering over large knowledge bases. CBR-KBQA consists of a nonparametric memory that stores cases (question and logical-form pairs) and a parametric model that generates a logical form for a new question by retrieving cases relevant to it. On several KBQA datasets containing complex questions, CBR-KBQA achieves competitive performance. For example, on the ComplexWebQuestions dataset, CBR-KBQA outperforms the current state of the art by 11% absolute accuracy. Furthermore, we show that CBR-KBQA can use new cases without any further training: by adding a few human-labeled examples to the case memory, CBR-KBQA successfully generates logical forms containing KB entities and relations unseen during training.
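The retrieve-then-generate loop described in the abstract can be sketched in a few lines. The case memory, example logical forms, and the bag-of-words cosine retriever below are all illustrative assumptions: the paper itself uses a learned dense retriever over a large case memory and a parametric seq2seq generator, neither of which is reproduced here.

```python
from collections import Counter
import math

# Hypothetical case memory of (question, logical form) pairs, standing in
# for the nonparametric memory described in the abstract. The logical
# forms are made-up placeholders, not the paper's actual outputs.
CASES = [
    ("who directed Inception", "(JOIN film.director (FILM Inception))"),
    ("who wrote Hamlet", "(JOIN book.author (BOOK Hamlet))"),
    ("who directed Titanic", "(JOIN film.director (FILM Titanic))"),
]

def bow(text):
    """Lowercased bag-of-words vector as a Counter."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=2):
    """Return the k cases most similar to the query. In CBR-KBQA these
    retrieved cases are given to the parametric model as context when it
    generates a logical form for the new question."""
    q = bow(query)
    return sorted(CASES, key=lambda c: cosine(q, bow(c[0])), reverse=True)[:k]

top = retrieve("who directed Avatar")
print(top[0][1])  # the nearest case is a film.director question
```

Because generation conditions on retrieved cases rather than solely on learned parameters, adding a new (question, logical form) pair to `CASES` immediately influences future outputs with no retraining, which is the property the abstract highlights.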

Similar Work