Pretrained Transformers For Simple Question Answering Over Knowledge Graphs

D. Lukovnikov, A. Fischer, J. Lehmann. Lecture Notes in Computer Science, 2019. 59 citations


Answering simple questions over knowledge graphs is a well-studied problem in question answering. Previous approaches to this task built on architectures based on recurrent and convolutional neural networks that use pretrained word embeddings. It was recently shown that finetuning pretrained transformer networks (e.g. BERT) can outperform previous approaches on various natural language processing tasks. In this work, we investigate how well BERT performs on SimpleQuestions and provide an evaluation of both BERT- and BiLSTM-based models in data-sparse scenarios.
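The fine-tuning recipe the abstract alludes to — a pretrained encoder topped with a classification head, trained end-to-end to map a question to a knowledge-graph relation — can be sketched as below. This is a minimal stand-in, not the paper's implementation: the tiny randomly initialised embedding encoder replaces BERT so the sketch runs without model downloads, and the relation inventory, vocabulary, and question/relation pairs are invented for illustration.

```python
import torch
import torch.nn as nn

# Toy relation inventory and vocabulary (invented for illustration).
RELATIONS = ["people/person/place_of_birth", "film/film/directed_by"]
VOCAB = {w: i for i, w in enumerate(
    "<pad> where was born who directed the movie".split())}

def encode(question, max_len=8):
    """Map a question string to a fixed-length list of token ids."""
    ids = [VOCAB.get(w, 0) for w in question.lower().split()][:max_len]
    return ids + [0] * (max_len - len(ids))

class RelationClassifier(nn.Module):
    """Mean-pooled token embeddings plus a linear head.
    The embedding layer is a stand-in for a pretrained encoder like BERT."""
    def __init__(self, vocab_size, dim=32, n_relations=len(RELATIONS)):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)
        self.head = nn.Linear(dim, n_relations)

    def forward(self, token_ids):
        pooled = self.emb(token_ids).mean(dim=1)  # [batch, dim]
        return self.head(pooled)                  # relation logits

# Toy question/relation training pairs (invented).
data = [("where was X born", 0), ("who directed the movie X", 1)]
x = torch.tensor([encode(q) for q, _ in data])
y = torch.tensor([r for _, r in data])

model = RelationClassifier(len(VOCAB))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
losses = []
for _ in range(50):  # a few "fine-tuning" steps on the toy data
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

With a real pretrained encoder, the same loop applies: swap the embedding layer for the transformer, feed it tokenised questions, and fine-tune all parameters jointly with the head.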

Similar Work