
Natural Language To Structured Query Generation Via Meta-learning

Po-Sen Huang, Chenglong Wang, Rishabh Singh, Wen-Tau Yih, Xiaodong He. Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), 2018 – 125 citations

NAACL

In conventional supervised training, a model is trained to fit all the training examples. However, having a monolithic model may not always be the best strategy, as examples could vary widely. In this work, we explore a different learning protocol that treats each example as a unique pseudo-task, by reducing the original learning problem to a few-shot meta-learning scenario with the help of a domain-dependent relevance function. When evaluated on the WikiSQL dataset, our approach leads to faster convergence and achieves 1.1%-5.4% absolute accuracy gains over the non-meta-learning counterparts.
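As a rough illustration of the training protocol described in the abstract, the sketch below implements a MAML-style inner/outer loop over pseudo-tasks in PyTorch. The linear model, toy data, cosine-similarity `relevance` function, and all hyperparameters are placeholder assumptions for illustration only; the paper instead applies the procedure to a WikiSQL semantic parser with a domain-dependent relevance function.

```python
# Minimal sketch of the pseudo-task meta-learning loop, assuming a MAML-style
# inner/outer update. Model, data, and hyperparameters are placeholders, not
# the authors' actual WikiSQL parser or relevance function.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy dataset: feature vectors standing in for encoded (question, table) pairs.
N, D, C = 200, 32, 4                      # examples, feature dim, output classes
X = torch.randn(N, D)
y = torch.randint(0, C, (N,))

model = nn.Linear(D, C)                   # placeholder for a seq2seq parser
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
inner_lr, K = 0.1, 5                      # inner-loop step size, support-set size

def relevance(query_x, k=K):
    """Hypothetical relevance function: cosine similarity in feature space.
    The paper assumes a domain-dependent relevance function here."""
    sims = F.cosine_similarity(X, query_x.unsqueeze(0), dim=1)
    return sims.topk(k + 1).indices[1:]   # drop the query example itself

for step in range(100):
    i = torch.randint(0, N, (1,)).item()  # each example is its own pseudo-task
    support = relevance(X[i])
    params = list(model.parameters())

    # Inner loop: one gradient step on the K relevant support examples.
    support_loss = F.cross_entropy(
        F.linear(X[support], params[0], params[1]), y[support])
    grads = torch.autograd.grad(support_loss, params, create_graph=True)
    adapted = [p - inner_lr * g for p, g in zip(params, grads)]

    # Outer loop: evaluate the adapted parameters on the query example.
    query_loss = F.cross_entropy(
        F.linear(X[i].unsqueeze(0), adapted[0], adapted[1]),
        y[i].unsqueeze(0))

    meta_opt.zero_grad()
    query_loss.backward()                 # meta-gradient through the inner step
    meta_opt.step()
```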
