
Consensus Attention-based Neural Networks For Chinese Reading Comprehension

Yiming Cui, Ting Liu, Zhipeng Chen, Shijin Wang, Guoping Hu. arXiv 2016 – 49 citations

[Paper]
Datasets, Interdisciplinary Approaches, Model Architecture, Neural Machine Translation, Variational Autoencoders

Reading comprehension has seen a boom in recent NLP research. Several institutes have released Cloze-style reading comprehension datasets, which have greatly accelerated research on machine comprehension. In this work, we first present Chinese reading comprehension datasets, consisting of the People Daily news dataset and the Children’s Fairy Tale (CFT) dataset. We also propose a consensus attention-based neural network architecture to tackle the Cloze-style reading comprehension problem, which aims to induce a consensus attention over every word in the query. Experimental results show that the proposed neural network significantly outperforms state-of-the-art baselines on several public datasets. Furthermore, we establish a baseline for the Chinese reading comprehension task, which we hope will speed up future research.
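The consensus attention idea can be illustrated with a short sketch: an attention distribution over the document is computed separately for each query word, and the per-word distributions are then merged into a single consensus distribution (the paper considers merging heuristics such as average, sum, and max). The code below is a minimal sketch assuming plain NumPy; the dimensions, function names, and random inputs are illustrative and not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def consensus_attention(doc_states, query_states, mode="avg"):
    """Merge per-query-word attentions over the document into one distribution.

    doc_states:   (doc_len, hidden)   contextual representations of document words
    query_states: (query_len, hidden) contextual representations of query words
    mode:         merging heuristic over query words ("avg", "sum", or "max")
    """
    # One attention distribution over the document per query word: (query_len, doc_len)
    scores = query_states @ doc_states.T
    alphas = softmax(scores, axis=-1)

    # Consensus step: collapse the query dimension with a simple heuristic.
    if mode == "avg":
        merged = alphas.mean(axis=0)
    elif mode == "sum":
        merged = alphas.sum(axis=0)
    elif mode == "max":
        merged = alphas.max(axis=0)
    else:
        raise ValueError(f"unknown mode: {mode}")

    # Renormalise so the result is again a distribution over document positions.
    return merged / merged.sum()

# Toy usage with random states; the predicted answer would be the document
# word receiving the highest consensus attention.
rng = np.random.default_rng(0)
doc = rng.normal(size=(30, 64))    # 30 document words, hidden size 64
query = rng.normal(size=(8, 64))   # 8 query words
attention = consensus_attention(doc, query, mode="avg")
print(attention.shape, attention.argmax())
```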

Similar Work