Paper Title
Jointly Learning Knowledge Embedding and Neighborhood Consensus with Relational Knowledge Distillation for Entity Alignment
Paper Authors
Paper Abstract
Entity alignment aims at integrating heterogeneous knowledge from different knowledge graphs. Recent studies employ embedding-based methods that first learn representations of the knowledge graphs and then perform entity alignment by measuring the similarity between entity embeddings. However, these methods fail to make good use of relation semantic information due to the trade-off caused by the differing objectives of learning knowledge embedding and neighborhood consensus. To address this problem, we propose Relational Knowledge Distillation for Entity Alignment (RKDEA), a Graph Convolutional Network (GCN)-based model equipped with knowledge distillation for entity alignment. We adopt GCN-based models to learn entity representations from the graph structure and incorporate relation semantic information into the GCN via knowledge distillation. We then introduce a novel adaptive mechanism for transferring relational knowledge so as to jointly learn entity embedding and neighborhood consensus. Experimental results on several benchmark datasets demonstrate the effectiveness of our proposed model.
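The pipeline the abstract describes (GCN-based entity representations, similarity-based alignment, and a distillation loss for transferring relational knowledge) can be illustrated with a minimal sketch. This is not the authors' RKDEA implementation: the function names, the single-layer GCN, the cosine-similarity alignment score, and the soft-label KL distillation loss are all generic stand-ins assumed here for illustration.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One GCN propagation step: symmetrically normalized adjacency
    (with self-loops) times node features times a weight matrix."""
    A_hat = A + np.eye(A.shape[0])                  # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    D = np.diag(d_inv_sqrt)
    return np.tanh(D @ A_hat @ D @ X @ W)           # nonlinear propagation

def align_scores(E1, E2):
    """Cosine-similarity matrix between the entity embeddings of two KGs;
    entity pairs with high scores are candidate alignments."""
    E1n = E1 / np.linalg.norm(E1, axis=1, keepdims=True)
    E2n = E2 / np.linalg.norm(E2, axis=1, keepdims=True)
    return E1n @ E2n.T

def distill_loss(student_logits, teacher_logits, T=2.0):
    """Generic knowledge-distillation loss (assumed form): KL divergence
    between temperature-softened teacher and student score distributions."""
    def softmax(z):
        z = z / T
        z = z - z.max(axis=1, keepdims=True)        # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)
    p, q = softmax(teacher_logits), softmax(student_logits)
    return float(np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)),
                                axis=1)))
```

In this sketch, a relation-aware teacher model would produce `teacher_logits` over candidate alignments, and the distillation loss would pull the GCN student's similarity scores toward them, which is one plausible way to reconcile the knowledge-embedding and neighborhood-consensus objectives the abstract mentions.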