Paper Title

On the Similarity between the Laplace and Neural Tangent Kernels

Authors

Amnon Geifman, Abhay Yadav, Yoni Kasten, Meirav Galun, David Jacobs, Ronen Basri

Abstract

Recent theoretical work has shown that massively overparameterized neural networks are equivalent to kernel regressors that use Neural Tangent Kernels (NTK). Experiments show that these kernel methods perform similarly to real neural networks. Here we show that NTK for fully connected networks is closely related to the standard Laplace kernel. We show theoretically that for normalized data on the hypersphere both kernels have the same eigenfunctions and their eigenvalues decay polynomially at the same rate, implying that their Reproducing Kernel Hilbert Spaces (RKHS) include the same sets of functions. This means that both kernels give rise to classes of functions with the same smoothness properties. The two kernels differ for data off the hypersphere, but experiments indicate that when data is properly normalized these differences are not significant. Finally, we provide experiments on real data comparing NTK and the Laplace kernel, along with a larger class of γ-exponential kernels. We show that these perform almost identically. Our results suggest that much insight about neural networks can be obtained from analysis of the well-known Laplace kernel, which has a simple closed form.
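
A minimal sketch (not the authors' code) of the comparison the abstract describes: it evaluates one standard closed form of the NTK of a one-hidden-layer ReLU network on unit-norm inputs against the Laplace kernel, the γ = 1 member of the γ-exponential family exp(−c‖x−y‖^γ), and compares the normalized eigenvalue decay of the two Gram matrices. The sample size, input dimension, and kernel width c = 1 are illustrative assumptions, as is the restriction to a two-layer NTK (the paper treats general fully connected networks).

```python
import numpy as np

def ntk_two_layer_relu(X, Y):
    # Closed form of the NTK for a one-hidden-layer ReLU network on
    # unit-norm inputs (up to scaling conventions): with u = <x, y>,
    #   Theta(x, y) = u * k0(u) + k1(u),
    # where k0 and k1 are the arc-cosine kernels of degrees 0 and 1.
    u = np.clip(X @ Y.T, -1.0, 1.0)  # clip guards arccos against round-off
    k0 = (np.pi - np.arccos(u)) / np.pi
    k1 = (np.sqrt(1.0 - u**2) + u * (np.pi - np.arccos(u))) / np.pi
    return u * k0 + k1

def gamma_exponential(X, Y, gamma=1.0, c=1.0):
    # gamma-exponential kernel exp(-c * ||x - y||^gamma); gamma = 1 gives
    # the Laplace kernel, gamma = 2 the Gaussian.
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    return np.exp(-c * d**gamma)

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 10))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # normalize onto the hypersphere

K_ntk = ntk_two_layer_relu(X, X)
K_lap = gamma_exponential(X, X, gamma=1.0)  # Laplace kernel

# The claim to probe is spectral: both Gram matrices should exhibit the same
# polynomial eigenvalue decay. Normalize by the top eigenvalue and compare.
ev_ntk = np.linalg.eigvalsh(K_ntk)[::-1]  # eigvalsh is ascending; reverse it
ev_lap = np.linalg.eigvalsh(K_lap)[::-1]
print("NTK     :", ev_ntk[:6] / ev_ntk[0])
print("Laplace :", ev_lap[:6] / ev_lap[0])
```

The two printed eigenvalue profiles can then be plotted on a log-log scale to inspect the decay rates side by side; the paper's theoretical claim concerns the population eigenvalues on the hypersphere, of which these Gram spectra are only a finite-sample proxy.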
