Paper Title


PPKE: Knowledge Representation Learning by Path-based Pre-training

Paper Authors

Bin He, Di Zhou, Jing Xie, Jinghui Xiao, Xin Jiang, Qun Liu

Paper Abstract


Entities may have complex interactions in a knowledge graph (KG), such as multi-step relationships, which can be viewed as graph contextual information about the entities. Traditional knowledge representation learning (KRL) methods usually treat a single triple as a training unit and neglect most of the graph contextual information that exists in the topological structure of KGs. In this study, we propose a Path-based Pre-training model to learn Knowledge Embeddings, called PPKE, which aims to integrate more graph contextual information between entities into the KRL model. Experiments demonstrate that our model achieves state-of-the-art results on several benchmark datasets for link prediction and relation prediction tasks, indicating that our model provides a feasible way to take advantage of graph contextual information in KGs.
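To make the notion of "path-based" context concrete, the sketch below samples multi-hop paths (alternating entity/relation sequences) from a toy triple store via random walks. This is only an illustration of how such paths could be extracted as pre-training sequences; the toy entities, relations, and the `sample_path` helper are made up here and are not taken from the paper.

```python
import random

# Toy KG as (head, relation, tail) triples; all names are illustrative.
triples = [
    ("paris", "capital_of", "france"),
    ("france", "located_in", "europe"),
    ("berlin", "capital_of", "germany"),
    ("germany", "located_in", "europe"),
]

# Index outgoing edges by head entity for fast walk extension.
out_edges = {}
for h, r, t in triples:
    out_edges.setdefault(h, []).append((r, t))

def sample_path(start, max_hops, rng=random):
    """Random walk from `start`, emitting an alternating
    entity/relation sequence, e.g. [e0, r1, e1, r2, e2]."""
    path = [start]
    node = start
    for _ in range(max_hops):
        edges = out_edges.get(node)
        if not edges:
            break  # dead end: no outgoing edges from this entity
        r, t = rng.choice(edges)
        path += [r, t]
        node = t
    return path

path = sample_path("paris", max_hops=2)
print(path)  # ['paris', 'capital_of', 'france', 'located_in', 'europe']
```

Sequences like this can then be fed to a sequence model the same way sentences are fed to a language-model pre-training objective, which is the general idea the abstract describes.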
