
Explicit Utilization Of General Knowledge In Machine Reading Comprehension

Chao Wang, Hui Jiang. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019 – 46 citations

[Paper]
Tags: ACL, Interdisciplinary Approaches, Model Architecture, Neural Machine Translation, Question Answering, Security, Training Techniques

To bridge the gap between Machine Reading Comprehension (MRC) models and human beings, which is mainly reflected in models' hunger for data and their lack of robustness to noise, this paper explores how to integrate the neural networks of MRC models with the general knowledge of human beings. On the one hand, the authors propose a data enrichment method that uses WordNet to extract inter-word semantic connections as general knowledge from each given passage-question pair. On the other hand, they propose an end-to-end MRC model named Knowledge Aided Reader (KAR), which explicitly uses the extracted general knowledge to assist its attention mechanisms. Based on the data enrichment method, KAR is comparable in performance with the state-of-the-art MRC models, and significantly more robust to noise than they are. When only a subset (20%-80%) of the training examples is available, KAR outperforms the state-of-the-art MRC models by a large margin, and remains reasonably robust to noise.
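The data enrichment step described above amounts to deciding, for each word pair in a passage-question pair, whether their senses are linked within a small number of hops in a lexical graph. A minimal sketch of that idea, using a tiny hand-coded synset graph as a hypothetical stand-in for WordNet (the paper uses WordNet itself, and its exact hop limit and relation set may differ):

```python
# Sketch: two words are "semantically connected" if some sense (synset)
# of one reaches some sense of the other within max_hops graph edges.
from collections import deque

# Hypothetical mini-lexicon standing in for WordNet (assumed data):
WORD_SYNSETS = {
    "car": {"car.n.01"},
    "automobile": {"car.n.01"},   # synonym: shares a synset with "car"
    "vehicle": {"vehicle.n.01"},
    "banana": {"banana.n.01"},
}
SYNSET_EDGES = {                  # e.g. hypernym/hyponym links
    "car.n.01": {"vehicle.n.01"},
    "vehicle.n.01": {"car.n.01"},
    "banana.n.01": set(),
}

def connected(w1, w2, max_hops=1):
    """True if some synset of w1 reaches some synset of w2 within max_hops."""
    targets = WORD_SYNSETS.get(w2, set())
    frontier = deque((syn, 0) for syn in WORD_SYNSETS.get(w1, set()))
    seen = set()
    while frontier:
        syn, dist = frontier.popleft()
        if syn in targets:
            return True
        if dist < max_hops and syn not in seen:
            seen.add(syn)
            for nxt in SYNSET_EDGES.get(syn, set()):
                frontier.append((nxt, dist + 1))
    return False

print(connected("car", "automobile"))  # True: shared synset
print(connected("car", "vehicle"))     # True: one hypernym hop
print(connected("car", "banana"))      # False: no connection
```

In KAR, flags like these (computed over every passage word and question word pair) are what the attention mechanisms consume as explicit general knowledge.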

Similar Work