Integrating Graph Contextualized Knowledge Into Pre-trained Language Models

Bin He, Di Zhou, Jinghui Xiao, Xin Jiang, Qun Liu, Nicholas Jing Yuan, Tong Xu. arXiv 2019 – 41 citations

[Paper]   Search on Google Scholar   Search on Semantic Scholar
Scalability

Complex node interactions are common in knowledge graphs (KGs), and these interactions also carry rich knowledge. However, traditional methods usually treat a single triple as the training unit during knowledge representation learning (KRL), neglecting the contextualized information of nodes in the KG. We generalize the modeling unit to a more general form that in principle supports any subgraph extracted from the knowledge graph, and feed these subgraphs into a novel Transformer-based model to learn knowledge embeddings. To broaden the usage scenarios of this knowledge, pre-trained language models are used to build a model that incorporates the learned knowledge representations. Experimental results show that our model achieves state-of-the-art performance on several medical NLP tasks, and the improvement over TransE indicates that our KRL method captures graph contextualized information effectively.
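The sketch below is not the authors' code; it is a minimal illustration, under assumed dimensions and a placeholder token embedding standing in for a real pre-trained language model, of the general idea the abstract describes: linearize a KG subgraph into a sequence of entity/relation ids, encode it with a Transformer to obtain a knowledge embedding, and fuse that embedding with the language model's token representations. All class names, shapes, and the mean-pooling/additive fusion are illustrative assumptions.

```python
# Hedged sketch (not the paper's implementation): Transformer encoding of a
# linearized KG subgraph, fused with token states of a stand-in language model.
import torch
import torch.nn as nn

class SubgraphKnowledgeEncoder(nn.Module):
    """Encodes a subgraph given as a sequence of entity/relation ids."""
    def __init__(self, kg_vocab, dim=128, heads=4, layers=2):
        super().__init__()
        self.embed = nn.Embedding(kg_vocab, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

    def forward(self, subgraph_ids):                 # (batch, graph_seq_len)
        h = self.encoder(self.embed(subgraph_ids))   # contextualize nodes/edges
        return h.mean(dim=1)                         # pooled knowledge embedding

class KnowledgeFusedLM(nn.Module):
    """Adds the pooled knowledge embedding to every token state of the LM."""
    def __init__(self, lm_vocab, kg_vocab, dim=128):
        super().__init__()
        self.token_embed = nn.Embedding(lm_vocab, dim)  # placeholder for a PLM
        self.kg_encoder = SubgraphKnowledgeEncoder(kg_vocab, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, token_ids, subgraph_ids):
        text = self.token_embed(token_ids)                 # (B, T, D)
        knowledge = self.kg_encoder(subgraph_ids)          # (B, D)
        return text + self.proj(knowledge).unsqueeze(1)    # knowledge-enriched states

if __name__ == "__main__":
    model = KnowledgeFusedLM(lm_vocab=1000, kg_vocab=500)
    tokens = torch.randint(0, 1000, (2, 16))   # toy sentence batch
    subgraph = torch.randint(0, 500, (2, 9))   # linearized (head, relation, tail) ids
    print(model(tokens, subgraph).shape)       # torch.Size([2, 16, 128])
```

In the paper's setting the fusion happens with an actual pre-trained language model and the learned knowledge embeddings come from subgraphs rather than single triples; the additive fusion shown here is just one simple way to combine the two representations.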

Similar Work