Language To Logical Form With Neural Attention

Li Dong, Mirella Lapata. Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2016 – 667 citations


Semantic parsing aims at mapping natural language to machine-interpretable meaning representations. Traditional approaches rely on high-quality lexicons, manually built templates, and linguistic features which are either domain- or representation-specific. In this paper we present a general method based on an attention-enhanced encoder-decoder model. We encode input utterances into vector representations, and generate their logical forms by conditioning the output sequences or trees on the encoding vectors. Experimental results on four datasets show that our approach performs competitively without using hand-engineered features and is easy to adapt across domains and meaning representations.
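To make the architecture concrete, below is a minimal PyTorch sketch of an attention-enhanced encoder-decoder of the kind the abstract describes (the sequence-decoding variant; the paper also proposes a tree decoder). This is an illustrative sketch, not the authors' implementation: the class name, dot-product attention, and all hyperparameters are assumptions for the example.

```python
import torch
import torch.nn as nn

class Seq2SeqWithAttention(nn.Module):
    """Illustrative sketch: map an utterance to logical-form tokens
    with an LSTM encoder-decoder plus dot-product attention.
    Names and sizes are hypothetical, not from the paper's code."""

    def __init__(self, src_vocab, tgt_vocab, emb_dim=128, hid_dim=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim * 2, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encode the input utterance into a sequence of hidden states.
        enc_states, (h, c) = self.encoder(self.src_emb(src_ids))
        # Decode logical-form tokens, conditioned on the final encoder state.
        dec_states, _ = self.decoder(self.tgt_emb(tgt_ids), (h, c))
        # Attention: each decoder state scores all encoder states ...
        scores = torch.bmm(dec_states, enc_states.transpose(1, 2))
        # ... and the softmax-weighted sum gives a context vector.
        context = torch.bmm(torch.softmax(scores, dim=-1), enc_states)
        # Predict the next logical-form token from state + context.
        return self.out(torch.cat([dec_states, context], dim=-1))

# Toy usage: batch of 2 utterances (length 7) -> logits over a
# logical-form vocabulary at each of 5 decoder steps.
model = Seq2SeqWithAttention(src_vocab=1000, tgt_vocab=500)
logits = model(torch.randint(0, 1000, (2, 7)), torch.randint(0, 500, (2, 5)))
print(logits.shape)  # torch.Size([2, 5, 500])
```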

Similar Work