Title

Scalable Graph Neural Networks for Heterogeneous Graphs

Authors

Lingfan Yu, Jiajun Shen, Jinyang Li, Adam Lerer

Abstract

Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data. Recent work has argued that GNNs primarily use the graph for feature smoothing, and have shown competitive results on benchmark tasks by simply operating on graph-smoothed node features, rather than using end-to-end learned feature hierarchies that are challenging to scale to large graphs. In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities. We propose Neighbor Averaging over Relation Subgraphs (NARS), which trains a classifier on neighbor-averaged features for randomly-sampled subgraphs of the "metagraph" of relations. We describe optimizations to allow these sets of node features to be computed in a memory-efficient way, both at training and inference time. NARS achieves a new state-of-the-art accuracy on several benchmark datasets, outperforming more expensive GNN-based methods.
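The core idea described in the abstract can be illustrated with a minimal sketch: sample a random subset of relation types, take the union of their edges as one subgraph, and compute multi-hop neighbor-averaged features on it (as in SGC/SIGN-style smoothing). Everything below is a toy illustration with hypothetical data, not the paper's actual implementation; names like `sample_subgraph_features` are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heterogeneous graph: 6 nodes, 3 relation types, each given as a
# dense adjacency matrix (hypothetical data; NARS targets large graphs).
n_nodes, n_relations, n_feats = 6, 3, 4
relations = [(rng.random((n_nodes, n_nodes)) < 0.4).astype(float)
             for _ in range(n_relations)]
features = rng.random((n_nodes, n_feats))

def neighbor_average(adj, feats, hops=2):
    """Repeatedly average neighbor features: a simple graph smoothing."""
    deg = adj.sum(axis=1, keepdims=True)
    # Row-normalize; isolated nodes (degree 0) get all-zero rows.
    norm_adj = np.divide(adj, deg, out=np.zeros_like(adj), where=deg > 0)
    out = [feats]
    for _ in range(hops):
        out.append(norm_adj @ out[-1])
    return out  # smoothed features at hops 0, 1, ..., hops

def sample_subgraph_features(relations, feats, k=2, hops=2, rng=rng):
    """Sample k relations, union their edges, then neighbor-average."""
    chosen = rng.choice(len(relations), size=k, replace=False)
    union_adj = np.clip(sum(relations[i] for i in chosen), 0.0, 1.0)
    return neighbor_average(union_adj, feats, hops)

smoothed = sample_subgraph_features(relations, features)
print(len(smoothed), smoothed[0].shape)  # one feature matrix per hop
```

A downstream classifier (e.g. an MLP) would then be trained on these per-hop feature sets, aggregated across several sampled relation subgraphs; since the smoothing is done once up front, no GNN message passing is needed at training time.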
