
UniK-QA: Unified Representations of Structured and Unstructured Knowledge for Open-Domain Question Answering

Barlas Oguz, Xilun Chen, Vladimir Karpukhin, Stan Peshterliev, Dmytro Okhonko, Michael Schlichtkrull, Sonal Gupta, Yashar Mehdad, Scott Yih. Findings of the Association for Computational Linguistics: NAACL 2022 – 66 citations

ACL, Has Code, Interdisciplinary Approaches, NAACL, Question Answering, Retrieval Systems

We study open-domain question answering with structured, unstructured and semi-structured knowledge sources, including text, tables, lists and knowledge bases. Departing from prior work, we propose a unifying approach that homogenizes all sources by reducing them to text and applies the retriever-reader model, which has so far been limited to text sources only. Our approach greatly improves the results on knowledge-base QA tasks by 11 points, compared to the latest graph-based methods. More importantly, we demonstrate that our unified knowledge (UniK-QA) model is a simple yet effective way to combine heterogeneous sources of knowledge, advancing the state-of-the-art results on two popular question answering benchmarks, NaturalQuestions and WebQuestions, by 3.5 and 2.6 points, respectively. The code of UniK-QA is available at: https://github.com/facebookresearch/UniK-QA.
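
The core idea in the abstract is that structured and semi-structured sources are "reduced to text" so a standard retriever-reader pipeline can index them like ordinary passages. The sketch below illustrates one way such linearization could look for a KB triple and a table row; the function names and output formats are illustrative assumptions, not taken from the UniK-QA repository linked above.

```python
# Minimal sketch (not the UniK-QA reference code) of flattening heterogeneous
# knowledge into plain text, so that the same retriever-reader pipeline used
# for Wikipedia passages can also retrieve tables, lists, and KB facts.
# Names and formats here are assumptions for illustration only.

def linearize_kb_triple(subject: str, relation: str, obj: str) -> str:
    """Flatten a knowledge-base triple into a plain-text sentence fragment."""
    return f"{subject} {relation.replace('_', ' ')} {obj}."

def linearize_table_row(table_title: str, header: list[str], row: list[str]) -> str:
    """Flatten one table row into text by pairing each cell with its column header."""
    cells = ", ".join(f"{col} is {val}" for col, val in zip(header, row))
    return f"{table_title}: {cells}."

if __name__ == "__main__":
    # KB triple -> text
    print(linearize_kb_triple("Barack Obama", "place_of_birth", "Honolulu"))
    # "Barack Obama place of birth Honolulu."

    # Table row -> text
    print(linearize_table_row(
        "List of US presidents",
        ["Name", "Term start", "Party"],
        ["Barack Obama", "2009", "Democratic"],
    ))
    # "List of US presidents: Name is Barack Obama, Term start is 2009, Party is Democratic."
```

The resulting strings can be chunked and indexed alongside regular text passages, which is what lets a single dense retriever and reader handle text, tables, lists, and knowledge-base facts uniformly.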

Similar Work