
Span-based Joint Entity And Relation Extraction With Transformer Pre-training

Markus Eberts, Adrian Ulges. arXiv 2019 – 152 citations


We introduce SpERT, an attention model for span-based joint entity and relation extraction. Our key contribution is a lightweight reasoning layer on top of BERT embeddings, which performs entity recognition and filtering, as well as relation classification with a localized, marker-free context representation. The model is trained using strong within-sentence negative samples, which are extracted efficiently in a single BERT pass. These aspects facilitate the search over all spans in the sentence. In ablation studies, we demonstrate the benefits of pre-training, strong negative sampling, and localized context. Our model outperforms prior work by up to 2.6% F1 score on several datasets for joint entity and relation extraction.
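For intuition, here is a minimal sketch of the span-enumeration idea the abstract describes: a single BERT pass encodes the sentence, every candidate span up to a maximum width is max-pooled into a fixed-size representation (concatenated with a learned width embedding and the sentence-level context), and a linear classifier scores it against the entity types plus a "no entity" class used for filtering. All class names and hyperparameters below are illustrative assumptions, not the authors' released implementation; see the paper for the exact architecture.

```python
# Illustrative sketch of SpERT-style span entity classification
# (hypothetical code, not the authors' implementation).
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class SpanEntityClassifier(nn.Module):
    def __init__(self, encoder_name="bert-base-cased",
                 num_entity_types=5, max_span_width=10, width_dim=25):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Learned embedding of the span width, as described in the paper.
        self.width_embed = nn.Embedding(max_span_width + 1, width_dim)
        # Span repr = max-pooled span tokens + [CLS] context + width embedding;
        # one extra output class stands for "no entity" (span filtering).
        self.classifier = nn.Linear(2 * hidden + width_dim, num_entity_types + 1)
        self.max_span_width = max_span_width

    def forward(self, input_ids, attention_mask):
        # Single BERT pass; every candidate span reuses these embeddings.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        tokens = out.last_hidden_state            # (1, seq_len, hidden)
        cls = tokens[:, 0]                        # sentence-level context
        seq_len = int(attention_mask.sum())

        spans, logits = [], []
        for start in range(1, seq_len - 1):       # skip [CLS] and [SEP]
            for end in range(start, min(start + self.max_span_width, seq_len - 1)):
                pooled = tokens[0, start:end + 1].max(dim=0).values
                width = self.width_embed(torch.tensor(end - start + 1))
                span_repr = torch.cat([pooled, cls[0], width])
                spans.append((start, end))
                logits.append(self.classifier(span_repr))
        return spans, torch.stack(logits)         # scores for every span

# Usage (illustrative):
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = SpanEntityClassifier()
enc = tokenizer("Markus works at DFKI in Kaiserslautern.", return_tensors="pt")
spans, logits = model(enc["input_ids"], enc["attention_mask"])
```

Note the design point the abstract emphasizes: because all span representations are pooled from one encoder pass, scoring every candidate span (and drawing strong within-sentence negatives from the same pass) stays cheap; the relation classifier, omitted here, pairs the surviving spans with a localized, marker-free context representation.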
