Paper Title

Spline Representation and Redundancies of One-Dimensional ReLU Neural Network Models

Paper Authors

Gerlind Plonka, Yannick Riebe, Yurii Kolomoitsev

Paper Abstract

We analyze the structure of a one-dimensional deep ReLU neural network (ReLU DNN) in comparison to the model of continuous piecewise linear (CPL) spline functions with arbitrary knots. In particular, we give a recursive algorithm to transfer the parameter set determining the ReLU DNN into the parameter set of a CPL spline function. Using this representation, we show that after removing the well-known parameter redundancies of the ReLU DNN, which are caused by the positive scaling property, all remaining parameters are independent. Moreover, we show that the ReLU DNN with one, two or three hidden layers can represent CPL spline functions with $K$ arbitrarily prescribed knots (breakpoints), where $K$ is the number of real parameters determining the normalized ReLU DNN (up to the output layer parameters). Our findings are useful to fix a priori conditions on the ReLU DNN to achieve an output with prescribed breakpoints and function values.
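The positive scaling redundancy mentioned in the abstract comes from the identity $\mathrm{ReLU}(\lambda t) = \lambda\,\mathrm{ReLU}(t)$ for $\lambda > 0$: rescaling a hidden neuron's incoming weight and bias by $\lambda$ and its outgoing weight by $1/\lambda$ leaves the network function unchanged. A minimal numerical check for a one-hidden-layer network (the names `w`, `b`, `v`, `c` are illustrative, not the paper's notation):

```python
import numpy as np

def relu(t):
    return np.maximum(t, 0.0)

def one_layer_net(x, w, b, v, c):
    # f(x) = sum_j v_j * ReLU(w_j * x + b_j) + c
    return v @ relu(np.outer(w, x) + b[:, None]) + c

rng = np.random.default_rng(0)
w, b, v, c = rng.normal(size=3), rng.normal(size=3), rng.normal(size=3), 0.5
lam = 2.7  # any positive scaling factor

x = np.linspace(-5.0, 5.0, 101)
f = one_layer_net(x, w, b, v, c)
f_scaled = one_layer_net(x, lam * w, lam * b, v / lam, c)  # rescaled parameters

assert np.allclose(f, f_scaled)  # the network function is unchanged
```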
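The paper's recursive algorithm covers networks with several hidden layers; the one-hidden-layer base case already shows how CPL spline parameters arise from network parameters. Each neuron with $w_j \neq 0$ contributes a knot at $x_j = -b_j/w_j$, and the slope of the network function jumps by $v_j\,|w_j|$ when crossing $x_j$ from left to right. A sketch under the same illustrative notation (this is only the base case, not the paper's recursion for deeper networks):

```python
import numpy as np

def cpl_parameters(w, b, v):
    """Knots and slope jumps of f(x) = sum_j v_j * ReLU(w_j*x + b_j) + c.

    A neuron with w_j != 0 produces a kink at x_j = -b_j / w_j; crossing
    x_j from left to right changes the slope of f by v_j * |w_j|.
    Neurons with w_j == 0 only add the constant v_j * ReLU(b_j).
    """
    active = w != 0
    knots = -b[active] / w[active]
    slope_jumps = v[active] * np.abs(w[active])
    # Slope of f far to the left: only neurons with w_j < 0 are active there.
    left_slope = np.sum(v[w < 0] * w[w < 0])
    order = np.argsort(knots)
    return knots[order], slope_jumps[order], left_slope
```

From `left_slope` and the sorted slope jumps, the CPL function can be rebuilt piece by piece; the contribution of the paper is carrying this transfer out recursively through the layers and showing that the remaining normalized parameters are independent.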
