
Improving Coreference Resolution By Learning Entity-level Distributed Representations

Kevin Clark, Christopher D. Manning. Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2016. 51 citations

[Paper]
ACL

A long-standing challenge in coreference resolution has been the incorporation of entity-level information - features defined over clusters of mentions instead of mention pairs. We present a neural network based coreference system that produces high-dimensional vector representations for pairs of coreference clusters. Using these representations, our system learns when combining clusters is desirable. We train the system with a learning-to-search algorithm that teaches it which local decisions (cluster merges) will lead to a high-scoring final coreference partition. The system substantially outperforms the current state-of-the-art on the English and Chinese portions of the CoNLL 2012 Shared Task dataset despite using few hand-engineered features.
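To make the cluster-merging idea concrete, here is a minimal, hypothetical sketch of agglomerative coreference resolution driven by a learned cluster-pair scorer. It is not the authors' implementation: in the paper, `score_merge` would be a neural network over distributed cluster-pair representations trained with learning-to-search, whereas here it is just a placeholder callable, and the toy scorer in the usage example is purely illustrative.

```python
# Illustrative sketch (not the authors' code): greedy cluster merging for
# coreference, where `score_merge` stands in for a learned cluster-pair scorer.
from itertools import combinations
from typing import Callable, List

Cluster = List[str]  # a cluster is simply a list of mention strings here


def resolve(mentions: List[str],
            score_merge: Callable[[Cluster, Cluster], float],
            threshold: float = 0.0) -> List[Cluster]:
    """Greedily merge clusters while some pair scores above the threshold."""
    clusters: List[Cluster] = [[m] for m in mentions]  # start from singletons
    while True:
        # Find the highest-scoring cluster pair under the (learned) scorer.
        best_pair, best_score = None, threshold
        for c1, c2 in combinations(clusters, 2):
            s = score_merge(c1, c2)
            if s > best_score:
                best_pair, best_score = (c1, c2), s
        if best_pair is None:
            # No merge is predicted to improve the partition; stop.
            return clusters
        c1, c2 = best_pair
        clusters.remove(c1)
        clusters.remove(c2)
        clusters.append(c1 + c2)  # commit the merge and keep going


if __name__ == "__main__":
    # Toy scorer: treat substring overlap as evidence of coreference.
    def toy_scorer(c1: Cluster, c2: Cluster) -> float:
        return 1.0 if any(a in b or b in a for a in c1 for b in c2) else -1.0

    print(resolve(["Hillary Clinton", "she", "Clinton", "Bill"], toy_scorer))
    # -> [['she'], ['Bill'], ['Hillary Clinton', 'Clinton']]
```

The point of the sketch is the control flow: each local decision is a cluster merge, and the quality of the final partition depends on which merges are chosen, which is why the paper trains the scorer with learning-to-search rather than on isolated mention pairs.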

Similar Work