Paper Title

The Representation Power of Neural Networks: Breaking the Curse of Dimensionality

Authors

Moise Blanchard, M. Amine Bennouna

Abstract

In this paper, we analyze the number of neurons and training parameters that a neural network needs to approximate multivariate functions of bounded second mixed derivatives -- Korobov functions. We prove upper bounds on these quantities for shallow and deep neural networks, breaking the curse of dimensionality. Our bounds hold for general activation functions, including ReLU. We further prove that these bounds nearly match the minimal number of parameters any continuous function approximator needs to approximate Korobov functions, showing that neural networks are near-optimal function approximators.
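
For context, here is a minimal sketch of the standard definition of the Korobov space of bounded second mixed derivatives, following the usual sparse-grid convention; this is stated here for illustration, and the exact norm and boundary conditions used in the paper may differ:

\[
  X^{2,\infty}\bigl([0,1]^d\bigr)
  = \Bigl\{ f : [0,1]^d \to \mathbb{R} \;\Big|\;
      f\vert_{\partial [0,1]^d} = 0,\;
      \bigl\| D^{\mathbf{k}} f \bigr\|_{L^\infty} < \infty
      \text{ for all } \mathbf{k} \in \mathbb{N}^d,\ |\mathbf{k}|_{\infty} \le 2
    \Bigr\},
\]

where $|\mathbf{k}|_{\infty} = \max_i k_i$ and $D^{\mathbf{k}} f = \partial^{k_1 + \cdots + k_d} f / \partial x_1^{k_1} \cdots \partial x_d^{k_d}$ denotes a mixed partial derivative. "Breaking the curse of dimensionality" means that the number of parameters needed to reach accuracy $\epsilon$ over this class does not grow exponentially in the dimension $d$, in contrast to generic smooth function classes, where it scales like $\epsilon^{-\Theta(d)}$.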
