Paper Title

Duality for Neural Networks through Reproducing Kernel Banach Spaces

Authors

Len Spek, Tjeerd Jan Heeringa, Felix Schwenninger, Christoph Brune

Abstract

Reproducing Kernel Hilbert spaces (RKHS) have been a very successful tool in various areas of machine learning. Recently, Barron spaces have been used to prove bounds on the generalisation error for neural networks. Unfortunately, Barron spaces cannot be understood in terms of RKHS due to the strong nonlinear coupling of the weights. This can be solved by using the more general Reproducing Kernel Banach spaces (RKBS). We show that these Barron spaces belong to a class of integral RKBS. This class can also be understood as an infinite union of RKHS. Furthermore, we show that the dual space of such an RKBS is again an RKBS in which the roles of the data and the parameters are interchanged, forming an adjoint pair of RKBS with a reproducing kernel. This allows us to construct a saddle point problem for neural networks, to which the whole field of primal-dual optimisation can be applied.
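For orientation, the integral representation underlying the Barron-space viewpoint can be sketched as follows. This is the standard definition from the Barron-space literature, not an excerpt from the paper itself, and the notation (activation σ, signed parameter measure μ) is illustrative:

```latex
% A Barron-type function is an infinite-width two-layer network,
% written as an integral over hidden-layer parameters (w, b)
% against a signed measure \mu:
f(x) \;=\; \int_{\mathbb{R}^{d}\times\mathbb{R}}
      \sigma\bigl(\langle w, x\rangle + b\bigr)\,\mathrm{d}\mu(w,b),
\qquad
\|f\|_{\mathcal{B}} \;=\;
      \inf_{\mu}\,\int \bigl(\|w\| + |b|\bigr)\,\mathrm{d}|\mu|(w,b),
% where the infimum runs over all measures \mu representing f and
% |\mu| is the total variation of \mu. The nonlinearity \sigma couples
% the weights nonlinearly, which is why no single RKHS captures this
% space; fixing a parameter distribution recovers an RKHS, and varying
% it yields the "infinite union of RKHS" picture from the abstract.
```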
