Paper Title
Empowering Language Models with Knowledge Graph Reasoning for Question Answering
Paper Authors
Paper Abstract
Answering open-domain questions requires world knowledge about in-context entities. Since pre-trained Language Models (LMs) lack the capacity to store all required knowledge, external knowledge sources, such as knowledge graphs, are often used to augment LMs. In this work, we propose the knOwledge REasOning empowered Language Model (OREO-LM), which consists of a novel Knowledge Interaction Layer that can be flexibly plugged into existing Transformer-based LMs to collaborate with a differentiable Knowledge Graph Reasoning module. In this way, the LM guides the KG to walk toward the desired answer, while the retrieved knowledge in turn improves the LM. By applying OREO-LM to RoBERTa and T5, we show significant performance gains, achieving state-of-the-art results in the Closed-Book setting. The performance improvement comes mainly from the KG reasoning module's capacity to infer missing relational facts. In addition, OREO-LM provides reasoning paths as rationales to interpret the model's decisions.
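To make the abstract's core idea concrete, here is a minimal toy sketch of a differentiable one-hop KG walk steered by an LM hidden state: a belief distribution over entities is propagated through relation adjacency matrices weighted by LM-predicted relation scores, and the retrieved (expected) entity embedding is fed back into the hidden state. All names, shapes, and the update rule here are illustrative assumptions for exposition; the actual OREO-LM architecture is defined in the paper, not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy KG: 4 entities, 2 relation types. M[r][i, j] = 1 iff the fact
# (entity i, relation r, entity j) holds. Purely illustrative data.
num_entities, num_relations, hidden = 4, 2, 8
M = np.zeros((num_relations, num_entities, num_entities))
M[0, 0, 1] = 1.0  # (e0, r0, e1)
M[1, 1, 2] = 1.0  # (e1, r1, e2)

entity_emb = rng.normal(size=(num_entities, hidden))  # entity embeddings
W_rel = rng.normal(size=(num_relations, hidden))      # relation scorer (hypothetical)

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def kg_interaction(h, p):
    """One hypothetical knowledge-interaction step.

    h: contextual hidden state from the LM, shape (hidden,)
    p: current belief distribution over entities, shape (num_entities,)
    Returns the updated hidden state and the new entity distribution.
    """
    # The LM state scores relation types -> soft choice of edge to follow.
    w = softmax(W_rel @ h)
    # Differentiable one-hop walk: follow every relation, weighted by w.
    p_next = sum(w[r] * (p @ M[r]) for r in range(num_relations))
    p_next = p_next / max(p_next.sum(), 1e-9)  # renormalize the belief
    # Feed the retrieved (expected) entity embedding back into the LM state.
    h_next = h + p_next @ entity_emb
    return h_next, p_next

# Two-hop demo: start with all belief on e0.
h = np.zeros(hidden)
p = np.eye(num_entities)[0]
for _ in range(2):
    h, p = kg_interaction(h, p)
# With this toy graph the only reachable 2-hop target is e2,
# so the belief is now concentrated on entity 2.
```

The key design point the sketch mirrors is that the walk is a soft matrix product rather than a discrete graph lookup, so gradients flow from the answer loss back into both the relation scorer and the LM state.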