
Addressing The Data Sparsity Issue In Neural AMR Parsing

Xiaochang Peng, Chuan Wang, Daniel Gildea, Nianwen Xue. Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers, 2017 – 77 citations


Neural attention models have achieved great success in different NLP tasks. However, they have not fulfilled their promise on the AMR parsing task due to the data sparsity issue. In this paper, we describe a sequence-to-sequence model for AMR parsing and present different ways to tackle the data sparsity problem. We show that our methods achieve significant improvement over a baseline neural attention model and our results are also competitive against state-of-the-art systems that do not use extra linguistic resources.
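A sequence-to-sequence AMR parser must first linearize each AMR graph into a token sequence that an encoder-decoder model can consume. The sketch below illustrates one common depth-first bracketed linearization; the graph encoding, function names, and the example sentence are illustrative assumptions, not the paper's exact preprocessing.

```python
# Hypothetical sketch: depth-first linearization of a tiny AMR graph into a
# bracketed token sequence, the kind of target-side preprocessing a
# sequence-to-sequence AMR parser needs. Graph encoding is illustrative only.

def linearize(node, graph):
    """Emit a bracketed token sequence for `node` and its outgoing edges."""
    tokens = ["(", graph[node]["concept"]]
    for role, child in graph[node].get("edges", []):
        tokens.append(role)              # relation label, e.g. :ARG0
        tokens.extend(linearize(child, graph))
    tokens.append(")")
    return tokens

# AMR for "The boy wants to go":
#   (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01))
amr = {
    "w": {"concept": "want-01", "edges": [(":ARG0", "b"), (":ARG1", "g")]},
    "b": {"concept": "boy"},
    "g": {"concept": "go-01"},
}

print(" ".join(linearize("w", amr)))
# → ( want-01 :ARG0 ( boy ) :ARG1 ( go-01 ) )
```

Because each concept token (e.g. `want-01`) is rare in a small treebank, a flat linearization like this suffers from exactly the data sparsity the abstract describes; the paper's methods target that vocabulary problem.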

Similar Work