
Broad-coverage Semantic Parsing As Transduction

Sheng Zhang, Xutai Ma, Kevin Duh, Benjamin van Durme. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019 – 83 citations

[Paper]
EMNLP

We unify different broad-coverage semantic parsing tasks under a transduction paradigm, and propose an attention-based neural framework that incrementally builds a meaning representation via a sequence of semantic relations. By leveraging multiple attention mechanisms, the transducer can be trained effectively without relying on a pre-trained aligner. Experiments on three broad-coverage semantic parsing tasks (Abstract Meaning Representation (AMR) parsing, Semantic Dependency Parsing (SDP), and Universal Conceptual Cognitive Annotation (UCCA) parsing) demonstrate that our attention-based neural transducer improves the state of the art on both AMR and UCCA, and is competitive with the state of the art on SDP.
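
To make the transduction idea concrete, the sketch below shows one possible shape of a single decode step: attend over the source sentence, emit a node label, then attend over previously generated nodes to choose a head and a relation label, so the graph grows one semantic relation at a time. This is a minimal illustration only, not the authors' implementation; the class name `TransducerStep`, the dimensions, and the simple bilinear arc scorer are assumptions for exposition, whereas the actual model additionally uses source- and target-side copy mechanisms and deep biaffine attention.

```python
# Minimal conceptual sketch (assumed names/dimensions, not the paper's code) of one
# decode step of an attention-based transducer that builds a semantic graph as a
# sequence of (node, relation, head) predictions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TransducerStep(nn.Module):
    """One step: attend to the source, emit a node label, then attend to
    previously generated nodes to pick a head and a relation label."""

    def __init__(self, hidden_dim, node_vocab, rel_vocab):
        super().__init__()
        self.cell = nn.LSTMCell(hidden_dim, hidden_dim)
        self.src_attn = nn.Linear(hidden_dim, hidden_dim, bias=False)  # source attention
        self.head_arc = nn.Bilinear(hidden_dim, hidden_dim, 1)         # head (edge) scorer
        self.node_out = nn.Linear(2 * hidden_dim, node_vocab)          # node label scores
        self.rel_out = nn.Linear(2 * hidden_dim, rel_vocab)            # relation label scores

    def forward(self, prev_emb, state, src_states, tgt_states):
        h, c = self.cell(prev_emb, state)                              # (1, H) each

        # Source attention: weight encoder states by similarity to the decoder state.
        scores = src_states @ self.src_attn(h).squeeze(0)              # (src_len,)
        src_ctx = F.softmax(scores, dim=0) @ src_states                # (H,)

        # Predict the new node's label from the decoder state + source context.
        node_logits = self.node_out(torch.cat([h.squeeze(0), src_ctx]))

        # Target attention: score previously generated nodes as candidate heads.
        arc_scores = self.head_arc(h.expand(len(tgt_states), -1),
                                   tgt_states).squeeze(-1)             # (tgt_len,)
        head = int(arc_scores.argmax())

        # Predict the relation label between the new node and its chosen head.
        rel_logits = self.rel_out(torch.cat([h.squeeze(0), tgt_states[head]]))

        return node_logits, head, rel_logits, (h, c)


# Toy usage: 5 source tokens, 3 nodes generated so far.
H = 16
step = TransducerStep(H, node_vocab=100, rel_vocab=40)
src = torch.randn(5, H)
tgt = torch.randn(3, H)
state = (torch.zeros(1, H), torch.zeros(1, H))
node_logits, head, rel_logits, state = step(torch.randn(1, H), state, src, tgt)
```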

Similar Work