Title

Kernel Memory Networks: A Unifying Framework for Memory Modeling

Authors

Georgios Iatropoulos, Johanni Brea, Wulfram Gerstner

Abstract

We consider the problem of training a neural network to store a set of patterns with maximal noise robustness. A solution, in terms of optimal weights and state update rules, is derived by training each individual neuron to perform either kernel classification or interpolation with a minimum weight norm. By applying this method to feed-forward and recurrent networks, we derive optimal models, termed kernel memory networks, that include, as special cases, many of the hetero- and auto-associative memory models that have been proposed over the past years, such as modern Hopfield networks and Kanerva's sparse distributed memory. We modify Kanerva's model and demonstrate a simple way to design a kernel memory network that can store an exponential number of continuous-valued patterns with a finite basin of attraction. The framework of kernel memory networks offers a simple and intuitive way to understand the storage capacity of previous memory models, and allows for new biological interpretations in terms of dendritic non-linearities and synaptic cross-talk.
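To make the recall mechanism concrete, here is a minimal illustrative sketch (our own toy construction, not the paper's exact model or notation) of a kernel-based auto-associative memory in the spirit of the modern Hopfield networks that the framework recovers as a special case: stored patterns act as kernel centers, and the state update is a softmax-kernel-weighted readout of the stored patterns.

```python
import numpy as np

# Toy sketch: auto-associative recall via a softmax (exponential) kernel
# over similarities to stored patterns.  All names and parameter values
# here are illustrative assumptions, not taken from the paper.

rng = np.random.default_rng(0)
X = rng.choice([-1.0, 1.0], size=(5, 64))   # 5 binary patterns, 64 units

def recall(cue, X, beta=4.0, steps=5):
    """Iterate the kernel update  x <- X^T softmax(beta * X x)."""
    x = cue.astype(float)
    for _ in range(steps):
        s = beta * X @ x                    # similarity of state to each stored pattern
        w = np.exp(s - s.max())
        w /= w.sum()                        # softmax kernel weights
        x = X.T @ w                         # kernel-weighted readout of patterns
    return np.sign(x)

# Corrupt 10 of 64 entries of pattern 0, then recover it from the noisy cue.
cue = X[0].copy()
flip = rng.choice(64, size=10, replace=False)
cue[flip] *= -1
print(np.array_equal(recall(cue, X), X[0]))   # prints True
```

With a sufficiently sharp kernel (large `beta`), the softmax weights concentrate on the stored pattern nearest the cue, so the noisy cue falls inside that pattern's basin of attraction and the iteration converges to the clean pattern.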
