Paper Title

Inferring Past Human Actions in Homes with Abductive Reasoning

Paper Authors

Clement Tan, Chai Kiat Yeo, Cheston Tan, Basura Fernando

Paper Abstract

Abductive reasoning aims to make the most likely inference for a given set of incomplete observations. In this paper, we introduce "Abductive Past Action Inference", a novel research task aimed at identifying the past actions performed by individuals within homes to reach specific states captured in a single image, using abductive inference. The research explores three key abductive inference problems: past action set prediction, past action sequence prediction, and abductive past action verification. We introduce several models tailored for abductive past action inference, including a relational graph neural network, a relational bilinear pooling model, and a relational transformer model. Notably, the newly proposed object-relational bilinear graph encoder-decoder (BiGED) model emerges as the most effective among all methods evaluated, demonstrating good proficiency in handling the intricacies of the Action Genome dataset. The contributions of this research significantly advance the ability of deep learning models to reason about current scene evidence and make highly plausible inferences about past human actions. This advancement enables a deeper understanding of events and behaviors, which can enhance decision-making and improve system capabilities across various real-world applications such as human-robot interaction, elderly care, and health monitoring. Code and data are available at https://github.com/LUNAProject22/AAR
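To make the three task formulations concrete, here is a minimal toy sketch of their interfaces. Everything below (the action vocabulary, the scalar state–action scores, the threshold and margin values) is an illustrative assumption, not the paper's method: the actual models operate on Action Genome scene graphs with relational GNN, bilinear pooling, transformer, and BiGED architectures.

```python
from itertools import permutations

# Hypothetical action vocabulary and toy compatibility scores between the
# observed final state (a single image, in the paper) and each past action.
# These numbers are made up purely to illustrate the three task interfaces.
ACTIONS = ["take cup", "open fridge", "pour drink"]
STATE_ACTION_SCORE = {"take cup": 0.9, "open fridge": 0.7, "pour drink": 0.8}

def predict_action_set(threshold=0.75):
    """Past action set prediction: multi-label selection of likely past actions."""
    return {a for a in ACTIONS if STATE_ACTION_SCORE[a] >= threshold}

def sequence_score(seq):
    """Toy plausibility score for an ordered hypothesis (sum of per-action scores)."""
    return sum(STATE_ACTION_SCORE[a] for a in seq)

def predict_action_sequence(action_set):
    """Past action sequence prediction: pick the highest-scoring ordering."""
    return max(permutations(action_set), key=sequence_score)

def verify_hypothesis(seq, margin=2.0):
    """Abductive past action verification: accept a candidate past-action
    sequence if its score clears a (made-up) plausibility margin."""
    return sequence_score(seq) >= margin
```

For example, `predict_action_set()` returns `{"take cup", "pour drink"}` under the toy scores, and `verify_hypothesis` accepts or rejects a candidate sequence against the margin. The real task is of course far harder: the model must rank exponentially many action sets and orderings from visual evidence alone.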
