
Multilingual Generative Language Models For Zero-shot Cross-lingual Event Argument Extraction

Kuan-Hao Huang, I-Hung Hsu, Premkumar Natarajan, Kai-Wei Chang, Nanyun Peng. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2022. 43 citations.


We present a study on leveraging multilingual pre-trained generative language models for zero-shot cross-lingual event argument extraction (EAE). By formulating EAE as a language generation task, our method effectively encodes event structures and captures the dependencies between arguments. We design language-agnostic templates to represent event argument structures; because these templates are compatible with any language, they facilitate cross-lingual transfer. Our proposed model fine-tunes multilingual pre-trained generative language models to generate sentences that fill in the language-agnostic template with arguments extracted from the input passage. The model is trained on source languages and then applied directly to target languages for event argument extraction. Experiments demonstrate that the proposed model outperforms the current state-of-the-art models on zero-shot cross-lingual EAE. Comprehensive studies and error analyses are presented to better understand the advantages and the current limitations of using generative language models for zero-shot cross-lingual transfer.
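The template-filling formulation described in the abstract can be sketched as follows. This is a toy illustration only: the role names (`Attacker`, `Target`, `Instrument`), the plain-text `<Role>` slot syntax, and the helper functions are assumptions made for this sketch, not the paper's actual template design.

```python
import re

# Toy sketch of generation-based event argument extraction (EAE) with
# a language-agnostic template. Role names and slot syntax here are
# hypothetical illustrations, not the paper's exact format.

def build_target(template: str, arguments: dict) -> str:
    """Fill each <Role> slot with its extracted argument; roles with
    no argument in the passage get a [None] placeholder."""
    return re.sub(r"<(\w+)>",
                  lambda m: arguments.get(m.group(1), "[None]"),
                  template)

def parse_output(template: str, generated: str) -> dict:
    """Recover role -> argument pairs by aligning the generated
    sentence with the template. Assumes the template contains no
    regex metacharacters outside the <Role> slots."""
    pattern = re.sub(r"<(\w+)>", r"(?P<\1>.+?)", template)
    match = re.fullmatch(pattern, generated)
    return match.groupdict() if match else {}

# The same template is reused across languages: the model is trained
# to produce filled templates on source-language data and applied
# directly to target-language passages.
template = "<Attacker> attacked <Target> using <Instrument>"
args = {"Attacker": "the rebels", "Target": "the convoy",
        "Instrument": "rockets"}
target = build_target(template, args)       # training target sentence
extracted = parse_output(template, target)  # role -> argument pairs
```

One design point the sketch makes visible: because arguments are generated jointly in a single template, the output naturally captures dependencies between argument roles, at the cost that parsing relies on the template's fixed anchor text between slots.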
