Paper Title
TactileSGNet: A Spiking Graph Neural Network for Event-based Tactile Object Recognition
Paper Authors
Paper Abstract
Tactile perception is crucial for a variety of robot tasks, including grasping and in-hand manipulation. New advances in flexible, event-driven electronic skins may soon endow robots with touch perception capabilities similar to humans. These electronic skins respond asynchronously to changes (e.g., in pressure or temperature) and can be laid out irregularly on the robot's body or end-effector. However, these unique features may render current deep learning approaches, such as convolutional feature extractors, unsuitable for tactile learning. In this paper, we propose a novel spiking graph neural network for event-based tactile object recognition. To make use of the local connectivity of taxels, we present several methods for organizing the tactile data in a graph structure. Based on the constructed graphs, we develop a spiking graph convolutional network. The event-driven nature of spiking neural networks makes them arguably more suitable for processing event-based data. Experimental results on two tactile datasets show that the proposed method outperforms other state-of-the-art spiking methods, achieving high accuracies of approximately 90% when classifying a variety of different household objects.
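The abstract mentions several methods for organizing taxel data in a graph structure. One natural instance, sketched below under our own assumptions, is a k-nearest-neighbour graph over 2-D taxel coordinates; the function name `build_knn_edges` and the choice of `k` are hypothetical, for illustration only, and are not claimed to be the paper's construction.

```python
# Minimal sketch: build a k-NN graph over taxel coordinates.
# Assumes taxel positions are given as an (N, 2) NumPy array;
# the function name and k value are illustrative assumptions.
import numpy as np

def build_knn_edges(coords: np.ndarray, k: int = 4) -> np.ndarray:
    """Return a (2, N*k) edge index connecting each taxel to its
    k nearest neighbours by Euclidean distance."""
    n = coords.shape[0]
    # Pairwise squared distances between all taxels.
    diff = coords[:, None, :] - coords[None, :, :]
    dist2 = (diff ** 2).sum(-1)
    np.fill_diagonal(dist2, np.inf)  # exclude self-loops
    # Indices of the k closest taxels for each node.
    nbrs = np.argsort(dist2, axis=1)[:, :k]
    src = np.repeat(np.arange(n), k)
    dst = nbrs.reshape(-1)
    return np.stack([src, dst])
```

Edges here are directed src → dst; adding the reversed pairs would make the graph undirected. Other constructions (e.g., connecting only physically adjacent taxels) fit the same edge-index format.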
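The constructed graph then drives a spiking graph convolutional network. Below is a minimal PyTorch sketch of one such layer, combining GCN-style mean aggregation with leaky integrate-and-fire (LIF) dynamics; the aggregation rule, time constant, threshold, and hard reset are assumptions for illustration, not the paper's exact architecture.

```python
# Minimal sketch of a spiking graph convolution layer: graph
# aggregation feeds a leaky integrate-and-fire (LIF) neuron.
# Hyperparameters (tau, v_th) are assumed values, not from the paper.
import torch
import torch.nn as nn

class SpikingGraphConv(nn.Module):
    def __init__(self, in_dim, out_dim, tau=2.0, v_th=1.0):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.decay = 1.0 - 1.0 / tau  # membrane leak factor
        self.v_th = v_th              # firing threshold

    def forward(self, x, edge_index, v):
        # x: (N, in_dim) input spikes; v: (N, out_dim) membrane state.
        src, dst = edge_index
        agg = torch.zeros_like(x)
        # Mean-aggregate neighbour features (GCN-style message passing).
        agg.index_add_(0, dst, x[src])
        deg = torch.zeros(x.size(0), device=x.device)
        deg.index_add_(0, dst, torch.ones_like(dst, dtype=x.dtype))
        agg = agg / deg.clamp(min=1).unsqueeze(-1)
        # LIF dynamics: leak, integrate, fire, hard reset.
        v = self.decay * v + self.lin(agg)
        spikes = (v >= self.v_th).float()
        v = v * (1.0 - spikes)
        return spikes, v
```

At run time the layer is stepped over discretized time bins, carrying the membrane state `v` across steps. Training through the hard threshold would require a surrogate gradient, which is omitted from this sketch.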