Paper Title

Generalizing Tensor Decomposition for N-ary Relational Knowledge Bases

Paper Authors

Yu Liu, Quanming Yao, Yong Li

Paper Abstract

With the rapid development of knowledge bases (KBs), the link prediction task, which completes KBs with missing facts, has been broadly studied, especially in binary relational KBs (a.k.a. knowledge graphs), with powerful tensor decomposition-related methods. However, the ubiquitous n-ary relational KBs, which contain higher-arity relational facts, have received less attention; existing translation-based and neural-network-based approaches for them have weak expressiveness and high complexity in modeling various relations. Tensor decomposition has not been considered for n-ary relational KBs, and directly extending tensor decomposition methods from binary relational KBs to the n-ary case does not yield satisfactory results, due to exponential model complexity and their strong assumptions on binary relations. To generalize tensor decomposition to n-ary relational KBs, in this work we propose GETD, a generalized model based on Tucker decomposition and Tensor Ring decomposition. The existing negative sampling technique is also generalized to the n-ary case for GETD. In addition, we theoretically prove that GETD is fully expressive and can completely represent any KB. Extensive evaluations on two representative n-ary relational KB datasets demonstrate the superior performance of GETD, which improves over the state-of-the-art methods by more than 15%. Moreover, GETD also obtains state-of-the-art results on benchmark binary relational KB datasets.
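To make the Tucker-decomposition component mentioned in the abstract concrete, below is a minimal NumPy sketch (not the authors' GETD implementation) of Tucker-style scoring for an n-ary fact (r, e_1, ..., e_n): an order-(n+1) core tensor is contracted with one relation embedding and n entity embeddings to produce a scalar plausibility score. The function name, embedding dimensions, and random initialization are illustrative assumptions; GETD additionally factorizes this core tensor with Tensor Ring decomposition to avoid the exponential parameter growth noted above, a step this sketch omits.

import numpy as np

def tucker_score(core, rel_emb, ent_embs):
    # Contract the relation embedding over the core's first mode, then
    # contract each entity embedding over the first remaining mode in turn.
    score = np.tensordot(rel_emb, core, axes=([0], [0]))
    for e in ent_embs:
        score = np.tensordot(e, score, axes=([0], [0]))
    return float(score)  # scalar plausibility of the n-ary fact

# Toy usage for a ternary (n = 3) fact with assumed dimensions.
rng = np.random.default_rng(0)
d_rel, d_ent, n = 4, 5, 3
core = rng.standard_normal((d_rel,) + (d_ent,) * n)    # order-(n+1) core tensor
rel = rng.standard_normal(d_rel)                        # relation embedding
ents = [rng.standard_normal(d_ent) for _ in range(n)]   # one embedding per entity position
print(tucker_score(core, rel, ents))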
