Paper Title

Reduced Order Modeling using Shallow ReLU Networks with Grassmann Layers

Paper Authors

Kayla Bollinger, Hayden Schaeffer

Paper Abstract

This paper presents a nonlinear model reduction method for systems of equations using a structured neural network. The neural network takes the form of a "three-layer" network with the first layer constrained to lie on the Grassmann manifold and the first activation function set to identity, while the remaining network is a standard two-layer ReLU neural network. The Grassmann layer determines the reduced basis for the input space, while the remaining layers approximate the nonlinear input-output system. The training alternates between learning the reduced basis and the nonlinear approximation, and is shown to be more effective than fixing the reduced basis and training only the network. An additional benefit of this approach is that, for data lying on low-dimensional subspaces, the number of parameters in the network does not need to be large. We show that our method can be applied to scientific problems in the data-scarce regime, which is typically not well-suited for neural network approximations. Examples include reduced order modeling for nonlinear dynamical systems and several aerospace engineering problems.
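
To make the described architecture concrete, below is a minimal PyTorch sketch of a network of this form: a first layer whose weight matrix has orthonormal columns (a representative of a point on the Grassmann manifold) followed by an identity activation, then a standard two-layer ReLU network, trained by alternating between the reduced basis and the nonlinear map. The class and function names, the QR retraction, and the plain Adam-based alternating loop are illustrative assumptions, not the authors' implementation (the paper may use a dedicated Riemannian/Grassmann optimizer).

```python
import torch
import torch.nn as nn

class GrassmannReducedNet(nn.Module):
    """Sketch of a "three-layer" network: a Grassmann (reduced-basis) layer with
    identity activation, followed by a standard two-layer ReLU network."""

    def __init__(self, n_in, k_reduced, n_hidden, n_out):
        super().__init__()
        # First layer: an n_in x k_reduced matrix with orthonormal columns,
        # i.e. a representative of a point on the Grassmann manifold Gr(k, n).
        A0, _ = torch.linalg.qr(torch.randn(n_in, k_reduced))
        self.A = nn.Parameter(A0)
        # Remaining layers: a two-layer ReLU network on the reduced coordinates.
        self.mlp = nn.Sequential(
            nn.Linear(k_reduced, n_hidden),
            nn.ReLU(),
            nn.Linear(n_hidden, n_out),
        )

    def forward(self, x):
        z = x @ self.A  # identity activation after the reduced-basis projection
        return self.mlp(z)

    @torch.no_grad()
    def retract(self):
        # Pull A back to orthonormal columns via a simple QR retraction
        # (an assumption; the paper may use a different manifold update).
        Q, _ = torch.linalg.qr(self.A)
        self.A.copy_(Q)


def alternating_train(model, x, y, outer_steps=20, inner_steps=100, lr=1e-3):
    """Alternate between training the ReLU network (basis frozen) and the
    reduced basis (network frozen); a stand-in for the paper's scheme."""
    loss_fn = nn.MSELoss()
    for _ in range(outer_steps):
        # (1) Fix the reduced basis, train the ReLU network.
        opt = torch.optim.Adam(model.mlp.parameters(), lr=lr)
        for _ in range(inner_steps):
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
        # (2) Fix the network, update the basis, retract back to Gr(k, n).
        opt = torch.optim.Adam([model.A], lr=lr)
        for _ in range(inner_steps):
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
            model.retract()
    return model


if __name__ == "__main__":
    # Synthetic example: the target depends only on a 3-dimensional subspace of R^50.
    x = torch.randn(256, 50)
    y = torch.sin(x[:, :3].sum(dim=1, keepdim=True))
    model = GrassmannReducedNet(n_in=50, k_reduced=3, n_hidden=32, n_out=1)
    alternating_train(model, x, y)
    print("training loss:", nn.MSELoss()(model(x), y).item())
```

Because the input is compressed to k_reduced coordinates before the ReLU layers, the hidden layer can stay small when the data lie near a low-dimensional subspace, which mirrors the parameter-count benefit the abstract mentions.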
