
Solving ESL Sentence Completion Questions Via Pre-trained Neural Language Models

Qiongqiong Liu, Tianqiao Liu, Jiafu Zhao, Qiang Fang, Wenbiao Ding, Zhongqin Wu, Feng Xia, Jiliang Tang, Zitao Liu. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021) – 48 citations

Tags: Compositional Generalization, Datasets, EMNLP, Has Code, Interdisciplinary Approaches, Multimodal, Semantic Representation, Tools

Sentence completion (SC) questions present a sentence with one or more blanks to be filled in, along with three to five candidate words or phrases as options. SC questions are widely used with students learning English as a Second Language (ESL), and building computational approaches that automatically solve such questions benefits language learners. In this work, we propose a neural framework that solves SC questions in English examinations by utilizing pre-trained language models. We conduct extensive experiments on a real-world K-12 ESL SC question dataset, and the results demonstrate the superiority of our model in terms of prediction accuracy. Furthermore, we run a precision-recall trade-off analysis to discuss practical issues that arise when deploying the model in real-life scenarios. To encourage reproducible results, we make our code publicly available at https://github.com/AIED2021/ESL-SentenceCompletion.
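The core idea of LM-based SC solving can be illustrated with a minimal sketch: plug each candidate option into the blank and pick the completion the language model scores highest. The snippet below is a generic, hedged illustration of that candidate-scoring loop, not the authors' exact framework; the `toy_score` function is a hypothetical stand-in for a pre-trained LM's (pseudo-)log-likelihood.

```python
def solve_sc_question(sentence: str, options: list[str], score_fn) -> str:
    """Fill the blank ('___') with each option and return the best-scoring one."""
    candidates = [sentence.replace("___", opt) for opt in options]
    scores = [score_fn(c) for c in candidates]
    return options[max(range(len(options)), key=lambda i: scores[i])]

def toy_score(text: str) -> float:
    # Hypothetical stand-in for an LM score: counts "plausible" bigrams.
    # In practice, this would be a pre-trained model's (pseudo-)log-likelihood.
    good_bigrams = {("the", "dog"), ("is", "running")}
    words = text.lower().split()
    return float(sum((a, b) in good_bigrams for a, b in zip(words, words[1:])))

best = solve_sc_question("The dog is ___ .", ["running", "blue"], toy_score)
# best == "running"
```

In a real deployment, `score_fn` would wrap a masked or autoregressive language model (e.g., BERT-style pseudo-log-likelihood or GPT-style sequence log-probability), and the precision-recall trade-off mentioned above could be managed by thresholding the score margin between the top two options.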
