Paper Title

Inductive Logical Query Answering in Knowledge Graphs

Paper Authors

Mikhail Galkin, Zhaocheng Zhu, Hongyu Ren, Jian Tang

Paper Abstract

Formulating and answering logical queries is a standard communication interface for knowledge graphs (KGs). Alleviating the notorious incompleteness of real-world KGs, neural methods achieved impressive results in link prediction and complex query answering tasks by learning representations of entities, relations, and queries. Still, most existing query answering methods rely on transductive entity embeddings and cannot generalize to KGs containing new entities without retraining the entity embeddings. In this work, we study the inductive query answering task where inference is performed on a graph containing new entities with queries over both seen and unseen entities. To this end, we devise two mechanisms leveraging inductive node and relational structure representations powered by graph neural networks (GNNs). Experimentally, we show that inductive models are able to perform logical reasoning at inference time over unseen nodes, generalizing to graphs up to 500% larger than training ones. Exploring the efficiency–effectiveness trade-off, we find the inductive relational structure representation method generally achieves higher performance, while the inductive node representation method is able to answer complex queries in the inference-only regime without any training on queries and scales to graphs of millions of nodes. Code is available at https://github.com/DeepGraphLearning/InductiveQE.
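
For readers unfamiliar with what "answering a logical query over a KG" means concretely, below is a minimal, purely symbolic sketch: a hypothetical toy graph and conjunctive query (not the paper's neural model or its datasets), answered by relation projection and set intersection. The GNN-based inductive representations studied in the paper replace these discrete set operations with learned ones that also generalize to entities unseen during training.

```python
# Minimal illustration (hypothetical toy example, not the paper's model):
# answering a two-branch conjunctive query over a small knowledge graph
# by relation traversal (projection) and set intersection.
from collections import defaultdict

# Knowledge graph as (head, relation, tail) triples.
triples = [
    ("alice", "works_at", "lab_x"),
    ("bob", "works_at", "lab_x"),
    ("bob", "works_at", "lab_y"),
    ("lab_x", "located_in", "montreal"),
    ("lab_y", "located_in", "toronto"),
]

# Index (head, relation) -> set of tails, for fast projection.
index = defaultdict(set)
for h, r, t in triples:
    index[(h, r)].add(t)

def project(entities, relation):
    """Relation projection: follow `relation` edges from every entity in the set."""
    out = set()
    for e in entities:
        out |= index[(e, relation)]
    return out

# Query: "Where are the labs that both alice and bob work at located?"
# Logical form: ?y . exists x : works_at(alice, x) AND works_at(bob, x) AND located_in(x, y)
labs_alice = project({"alice"}, "works_at")
labs_bob = project({"bob"}, "works_at")
shared_labs = labs_alice & labs_bob           # conjunction = set intersection
answers = project(shared_labs, "located_in")  # final projection to the answer variable
print(answers)  # {'montreal'}
```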
