
Retrieval-augmented Generative Question Answering For Event Argument Extraction

Xinya Du, Heng Ji. Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing – 17 citations

[Paper]    
Few-Shot Prompting RAG

Event argument extraction has long been studied as a sequential prediction problem with extractive methods, tackling each argument in isolation. Although recent work proposes generation-based methods to capture cross-argument dependencies, they require generating and post-processing a complicated target sequence (template). Motivated by these observations and by recent pretrained language models’ ability to learn from demonstrations, we propose a retrieval-augmented generative QA model (R-GQA) for event argument extraction. It retrieves the most similar QA pair, augments it as a prompt to the current example’s context, and then decodes the arguments as answers. Our approach substantially outperforms prior methods across various settings (i.e., fully supervised, domain transfer, and few-shot learning). Finally, we propose a clustering-based sampling strategy (JointEnc) and conduct a thorough analysis of how different strategies influence few-shot learning performance. The implementation is available at https://github.com/xinyadu/RGQA
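The abstract describes a retrieve-then-prompt pipeline: find the most similar QA demonstration, prepend it to the current context and question, and let a generative model decode the argument as the answer. Below is a minimal, hedged sketch of that idea; the demonstration pool, the TF-IDF retriever, and the prompt format are illustrative assumptions, not the paper's actual implementation (see the linked RGQA repository for that).

```python
# Minimal sketch of retrieval-augmented QA prompting for argument extraction.
# The demonstration triples and prompt template below are hypothetical; the
# paper's implementation lives at https://github.com/xinyadu/RGQA.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical demonstration pool of (context, question, answer) triples.
demonstrations = [
    ("The troops attacked the village at dawn.",
     "Who is the attacker in the attack event?", "The troops"),
    ("Smith was arrested by local police on Friday.",
     "Who is the agent in the arrest event?", "local police"),
]

def retrieve_demo(context, demos):
    """Return the demonstration whose context is most similar to `context`."""
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform([context] + [c for c, _, _ in demos])
    sims = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
    return demos[sims.argmax()]

def build_prompt(context, question, demos):
    """Prepend the retrieved QA pair as a demonstration for the current example."""
    d_ctx, d_q, d_a = retrieve_demo(context, demos)
    return (f"context: {d_ctx}\nquestion: {d_q}\nanswer: {d_a}\n\n"
            f"context: {context}\nquestion: {question}\nanswer:")

prompt = build_prompt(
    "Rebels bombed the embassy on Tuesday.",
    "Who is the attacker in the attack event?",
    demonstrations,
)
# `prompt` would then be fed to a generative QA model (e.g. a seq2seq LM),
# which decodes the argument span ("Rebels") as the answer.
print(prompt)
```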

Similar Work