
Generative Language Modeling For Automated Theorem Proving

Stanislas Polu, Ilya Sutskever. arXiv 2020. 47 citations

[Paper]
Topics: Compositional Generalization, Interdisciplinary Approaches, Model Architecture, Multimodal, Semantic Representation, Tools

We explore the application of transformer-based language models to automated theorem proving. This work is motivated by the possibility that a major limitation of automated theorem provers compared to humans, namely the generation of original mathematical terms, might be addressable via generation from language models. We present an automated prover and proof assistant, GPT-f, for the Metamath formalization language, and analyze its performance. GPT-f found new short proofs that were accepted into the main Metamath library, which is, to our knowledge, the first time a deep-learning-based system has contributed proofs that were adopted by a formal mathematics community.
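The abstract describes proof search driven by a generative model that proposes proof steps for open goals. The sketch below illustrates that general idea only; it is not the paper's implementation. The functions `generate_candidates` and `apply_step` are hypothetical stubs standing in for the trained transformer and a Metamath proof checker, and the search simply expands the node with the highest cumulative model log-probability first.

```python
# Minimal sketch of language-model-guided proof search (illustrative only).
# `generate_candidates` and `apply_step` are hypothetical stubs: in a real
# system, a trained transformer would propose proof steps for a goal and a
# Metamath kernel would verify them and return the remaining subgoals.

import heapq
from dataclasses import dataclass, field


@dataclass(order=True)
class SearchNode:
    neg_logprob: float                    # priority: lower is better (more likely)
    goals: tuple = field(compare=False)   # open goals left to close
    proof: tuple = field(compare=False)   # proof steps taken so far


def generate_candidates(goal: str, n: int = 4):
    """Stub for the language model: return (proof_step, logprob) pairs."""
    return [(f"step_{i}_for[{goal}]", -float(i + 1)) for i in range(n)]


def apply_step(goal: str, step: str):
    """Stub verifier: return the subgoals left after applying `step` to `goal`.

    Here every step closes its goal; a real checker would validate the step
    and may return new subgoals.
    """
    return []


def best_first_search(root_goal: str, max_expansions: int = 100):
    """Expand nodes in order of cumulative model log-probability."""
    frontier = [SearchNode(0.0, (root_goal,), ())]
    for _ in range(max_expansions):
        if not frontier:
            return None
        node = heapq.heappop(frontier)
        if not node.goals:
            return node.proof             # all goals closed: proof found
        goal, rest = node.goals[0], node.goals[1:]
        for step, logprob in generate_candidates(goal):
            subgoals = apply_step(goal, step)
            heapq.heappush(frontier, SearchNode(
                node.neg_logprob - logprob,
                tuple(subgoals) + rest,
                node.proof + (step,),
            ))
    return None


if __name__ == "__main__":
    print(best_first_search("|- ( 2 + 2 ) = 4"))
```

The stubbed example returns a one-step "proof" immediately; its purpose is only to show how model scores can prioritize which candidate proof steps to explore before a formal verifier confirms them.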

Similar Work