Paper Title

Diversifying Content Generation for Commonsense Reasoning with Mixture of Knowledge Graph Experts

Paper Authors

Wenhao Yu, Chenguang Zhu, Lianhui Qin, Zhihan Zhang, Tong Zhao, Meng Jiang

Paper Abstract

Generative commonsense reasoning (GCR) in natural language requires reasoning about commonsense while generating coherent text. Recent years have seen a surge of interest in improving the generation quality of commonsense reasoning tasks. Nevertheless, these approaches have seldom investigated diversity in GCR tasks, which aim to generate alternative explanations for a real-world situation or to predict all possible outcomes. Diversifying GCR is challenging, as it requires generating multiple outputs that are not only semantically different but also grounded in commonsense knowledge. In this paper, we propose MoKGE, a novel method that diversifies generative reasoning with a mixture-of-experts (MoE) strategy over commonsense knowledge graphs (KG). A set of knowledge experts seek diverse reasoning paths on the KG to encourage varied generation outputs. Empirical experiments demonstrate that MoKGE significantly improves diversity while achieving on-par accuracy on two GCR benchmarks, based on both automatic and human evaluations.
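
To make the MoE idea concrete, below is a minimal, hypothetical sketch of the hard-EM-style expert routing commonly used to train mixtures of experts for diverse generation. It is not the authors' implementation: the toy model (`TinyConditionalModel`), the expert embeddings, and `hard_em_step` are illustrative stand-ins for MoKGE's KG-grounded seq2seq generator and its actual training objective.

```python
import torch
import torch.nn as nn

class TinyConditionalModel(nn.Module):
    """Toy stand-in for a KG-grounded seq2seq generator.

    Each expert is an embedding added to the shared input encoding,
    steering the shared model toward a different region of output space.
    """
    def __init__(self, dim=32, vocab=100, n_experts=3):
        super().__init__()
        self.encoder = nn.Linear(vocab, dim)
        self.expert_embeddings = nn.Embedding(n_experts, dim)
        self.decoder = nn.Linear(dim, vocab)

    def forward(self, x, expert_id):
        h = self.encoder(x) + self.expert_embeddings(expert_id)
        return self.decoder(torch.relu(h))

def hard_em_step(model, x, y, n_experts, loss_fn):
    """Hard-EM routing: score every expert on the example, keep only the
    best expert's loss for backprop, which encourages specialization."""
    losses = torch.stack([
        loss_fn(model(x, torch.tensor([e])), y) for e in range(n_experts)
    ])
    best = int(losses.argmin())
    return losses[best], best

# Usage: one gradient step on a random toy batch.
model = TinyConditionalModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(1, 100)            # toy "input concepts" features
y = torch.randint(0, 100, (1,))    # toy target token
loss, chosen = hard_em_step(model, x, y, n_experts=3,
                            loss_fn=nn.CrossEntropyLoss())
opt.zero_grad()
loss.backward()
opt.step()
print(f"expert {chosen} selected, loss {loss.item():.3f}")
```

At inference time, each expert conditions one decoding pass, so k experts yield k (ideally distinct) outputs for the same input, which is how an MoE scheme like this promotes output diversity.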
