Paper Title

Stabilized Neural Ordinary Differential Equations for Long-Time Forecasting of Dynamical Systems

Authors

Linot, Alec J., Burby, Joshua W., Tang, Qi, Balaprakash, Prasanna, Graham, Michael D., Maulik, Romit

Abstract

In data-driven modeling of spatiotemporal phenomena, careful consideration often needs to be made in capturing the dynamics of the high wavenumbers. This problem becomes especially challenging when the system of interest exhibits shocks or chaotic dynamics. We present a data-driven modeling method that accurately captures shocks and chaotic dynamics by proposing a novel architecture, the stabilized neural ordinary differential equation (ODE). In our proposed architecture, we learn the right-hand side (RHS) of an ODE by adding the outputs of two NNs together, where one learns a linear term and the other a nonlinear term. Specifically, we implement this by training a sparse linear convolutional NN to learn the linear term and a dense fully-connected nonlinear NN to learn the nonlinear term. This is in contrast with the standard neural ODE, which involves training only a single NN for learning the RHS. We apply this setup to the viscous Burgers equation, which exhibits shock behavior, and show better short-time tracking and prediction of the energy spectrum at high wavenumbers than a standard neural ODE. We also find that the stabilized neural ODE models are much more robust to noisy initial conditions than the standard neural ODE approach. We also apply this method to chaotic trajectories of the Kuramoto-Sivashinsky equation. In this case, stabilized neural ODEs keep long-time trajectories on the attractor, and are highly robust to noisy initial conditions, while standard neural ODEs fail at achieving either of these results. We conclude by demonstrating how stabilized neural ODEs provide a natural extension for use in reduced-order modeling by projecting the dynamics onto the eigenvectors of the learned linear term.
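As a minimal sketch of the core idea (not the authors' code — all shapes, widths, and weight scales here are hypothetical placeholders), the stabilized neural ODE splits the learned RHS into a sparse linear convolutional term plus a dense fully-connected nonlinear term, and the resulting vector field can be integrated with any standard time stepper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64          # number of grid points (hypothetical)
hidden = 32     # width of the nonlinear MLP (hypothetical)

# Linear term: a learned 5-point stencil applied as a periodic
# (circular) convolution, i.e. a sparse linear operator.
stencil = rng.standard_normal(5) * 0.01

# Nonlinear term: a one-hidden-layer fully-connected network.
W1 = rng.standard_normal((hidden, n)) * 0.1
b1 = np.zeros(hidden)
W2 = rng.standard_normal((n, hidden)) * 0.1
b2 = np.zeros(n)

def linear_term(u):
    # Circular convolution of u with the learned stencil.
    out = np.zeros_like(u)
    k = len(stencil) // 2
    for i, c in enumerate(stencil):
        out += c * np.roll(u, i - k)
    return out

def nonlinear_term(u):
    return W2 @ np.tanh(W1 @ u + b1) + b2

def rhs(u):
    # Stabilized neural ODE: RHS is the sum of the two networks'
    # outputs, rather than a single monolithic NN.
    return linear_term(u) + nonlinear_term(u)

def rk4_step(u, dt):
    # Classical fourth-order Runge-Kutta step on the learned RHS.
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    return u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Roll out a trajectory from a smooth initial condition.
u = np.sin(2 * np.pi * np.arange(n) / n)
for _ in range(100):
    u = rk4_step(u, 1e-2)
print(u.shape)  # (64,)
```

In training, both sets of weights would be fit jointly through the ODE solve (as in standard neural ODEs); the reduced-order-modeling extension mentioned in the abstract would then project the state onto the eigenvectors of the learned linear operator.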
