
An Attentive Neural Architecture For Fine-grained Entity Type Classification

Sonse Shimaoka, Pontus Stenetorp, Kentaro Inui, Sebastian Riedel. Proceedings of the 5th Workshop on Automated Knowledge Base Construction, 2016 – 95 citations


In this work we propose a novel attention-based neural network model for the task of fine-grained entity type classification that, unlike previously proposed models, recursively composes representations of entity mention contexts. Our model achieves state-of-the-art performance with a 74.94% loose micro F1-score on the well-established FIGER dataset, a relative improvement of 2.59%. We also investigate the behavior of the attention mechanism and observe that it can learn contextual linguistic expressions that indicate the fine-grained category memberships of an entity.
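The sketch below illustrates the general idea described in the abstract — encoding the words around an entity mention with a recurrent network, attending over the resulting hidden states, and feeding the attended context together with a mention representation into a multi-label type classifier. It is a minimal PyTorch illustration, not the authors' released implementation; the dimensions, the FIGER type count (113), and the averaged-embedding mention encoder are assumptions made for the example.

```python
# Minimal sketch of an attentive context encoder for fine-grained entity
# typing (illustrative assumptions; not the paper's exact architecture).
import torch
import torch.nn as nn


class AttentiveTyper(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden=100, attn_dim=100, num_types=113):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional LSTM over the context words surrounding the mention.
        self.context_rnn = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        # Small scoring network that assigns one attention weight per context position.
        self.attn_hidden = nn.Linear(2 * hidden, attn_dim)
        self.attn_score = nn.Linear(attn_dim, 1)
        # Multi-label classifier over fine-grained types (one sigmoid per type).
        self.classifier = nn.Linear(emb_dim + 2 * hidden, num_types)

    def forward(self, mention_ids, context_ids):
        # mention_ids: (batch, mention_len); context_ids: (batch, context_len)
        mention = self.embed(mention_ids).mean(dim=1)           # averaged mention embedding
        states, _ = self.context_rnn(self.embed(context_ids))   # (batch, ctx_len, 2*hidden)
        scores = self.attn_score(torch.tanh(self.attn_hidden(states)))  # (batch, ctx_len, 1)
        weights = torch.softmax(scores, dim=1)                   # attention over positions
        context = (weights * states).sum(dim=1)                  # attended context vector
        logits = self.classifier(torch.cat([mention, context], dim=-1))
        return torch.sigmoid(logits), weights.squeeze(-1)        # type probabilities + attention


# Toy usage: one example with a 2-token mention and a 6-token context window.
model = AttentiveTyper(vocab_size=5000)
probs, attn = model(torch.randint(0, 5000, (1, 2)), torch.randint(0, 5000, (1, 6)))
print(probs.shape, attn.shape)  # torch.Size([1, 113]) torch.Size([1, 6])
```

The returned attention weights are what makes the behavior analysis mentioned in the abstract possible: inspecting which context positions receive high weight reveals the contextual expressions the model associates with particular fine-grained types.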

Similar Work