Paper Title


Neural Networks Enhancement with Logical Knowledge

Paper Authors

Alessandro Daniele, Luciano Serafini

Paper Abstract


In the recent past, there has been a growing interest in Neural-Symbolic Integration frameworks, i.e., hybrid systems that integrate connectionist and symbolic approaches to obtain the best of both worlds. In previous work, we proposed KENN (Knowledge Enhanced Neural Networks), a Neural-Symbolic architecture that injects prior logical knowledge into a neural network by adding a new final layer which modifies the initial predictions according to the knowledge. Among the advantages of this strategy is the inclusion of clause weights, learnable parameters that represent the strength of the clauses, meaning that the model can learn the impact of each clause on the final predictions. As a special case, if the training data contradicts a constraint, KENN learns to ignore it, making the system robust to the presence of wrong knowledge. In this paper, we propose an extension of KENN for relational data. To evaluate this new extension, we tested it with different learning configurations on Citeseer, a standard dataset for Collective Classification. The results show that KENN is capable of improving the performance of the underlying neural network even in the presence of relational data, outperforming two other notable methods that combine learning with logic.
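To make the mechanism concrete, below is a minimal sketch (in PyTorch) of what a KENN-style clause-enhancement layer could look like: it takes the base network's pre-activations and adds a residual correction, scaled by a learnable clause weight, that pushes the predictions towards satisfying one clause. The class name ClauseEnhancer, the softmax-based boost, and all parameter names are illustrative assumptions made here, not the authors' actual implementation.

```python
# Illustrative sketch of a KENN-style clause-enhancement layer (assumed names/design).
import torch
import torch.nn as nn

class ClauseEnhancer(nn.Module):
    """Residual layer that nudges pre-activations towards satisfying
    a single propositional clause, e.g. A OR NOT B (sketch only)."""

    def __init__(self, clause_literals):
        # clause_literals: list of (predicate_index, sign) pairs,
        # where sign is +1 for a positive literal and -1 for a negated one.
        super().__init__()
        self.register_buffer("indices", torch.tensor([i for i, _ in clause_literals]))
        self.register_buffer("signs", torch.tensor([float(s) for _, s in clause_literals]))
        # Learnable clause weight: its magnitude encodes how strongly the clause
        # is enforced; training can drive it towards zero if the data contradicts it.
        self.clause_weight = nn.Parameter(torch.tensor(0.5))

    def forward(self, z):
        # z: [batch, num_predicates] pre-activations (logits) of the base network.
        literals = self.signs * z[:, self.indices]   # orient literals (flip negated ones)
        boost = torch.softmax(literals, dim=-1)      # favour the literal easiest to satisfy
        delta = torch.zeros_like(z)
        delta[:, self.indices] = self.signs * self.clause_weight * boost
        return z + delta                             # enhanced pre-activations

# Usage sketch with dummy pre-activations for three predicates and the clause A OR NOT B:
z = torch.randn(4, 3)
enhancer = ClauseEnhancer([(0, +1), (1, -1)])
print(enhancer(z).shape)  # torch.Size([4, 3])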
