Paper Title
Multi-resolution partial differential equations preserved learning framework for spatiotemporal dynamics
Paper Authors
Paper Abstract
Traditional data-driven deep learning models often struggle with high training costs, error accumulation, and poor generalizability in complex physical processes. Physics-informed deep learning (PiDL) addresses these challenges by incorporating physical principles into the model. Most PiDL approaches regularize training by embedding governing equations into the loss function, yet this depends heavily on extensive hyperparameter tuning to weight each loss term. To this end, we propose to leverage prior physics knowledge by "baking" the discretized governing equations into the neural network architecture via the connection between partial differential equation (PDE) operators and network structures, resulting in a PDE-preserved neural network (PPNN). This method, which embeds discretized PDEs through convolutional residual networks in a multi-resolution setting, largely improves generalizability and long-term prediction accuracy, outperforming conventional black-box models. The effectiveness and merit of the proposed method have been demonstrated across various dynamical systems governed by spatiotemporal PDEs, including the reaction-diffusion, Burgers', and Navier-Stokes equations.
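The core connection the abstract invokes, that finite-difference PDE operators can be expressed as fixed convolution kernels inside a residual update, can be illustrated with a minimal sketch. This is not the authors' PPNN implementation; it is a hypothetical NumPy example assuming a 2D heat equation u_t = nu * ∇²u, where the 5-point Laplacian stencil plays the role of a convolution filter and one explicit Euler step takes the residual form u_{n+1} = u_n + dt * F(u_n):

```python
import numpy as np

# Hypothetical sketch (not the paper's code): a fixed finite-difference
# stencil acts as a convolution kernel, and an explicit Euler step of the
# heat equation becomes a residual-network-style update.

# 5-point Laplacian stencil; as a conv kernel it is non-trainable,
# "preserving" the PDE operator inside the network.
LAPLACIAN_STENCIL = np.array([[0.0,  1.0, 0.0],
                              [1.0, -4.0, 1.0],
                              [0.0,  1.0, 0.0]])

def conv2d_periodic(u, kernel):
    """Apply a 3x3 stencil to a 2D field with periodic boundaries."""
    out = np.zeros_like(u)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            # out[i, j] accumulates kernel[di+1, dj+1] * u[i+di, j+dj]
            out += kernel[di + 1, dj + 1] * np.roll(
                np.roll(u, -di, axis=0), -dj, axis=1)
    return out

def pde_residual_step(u, nu=0.1, dt=0.01, dx=1.0):
    """One residual update u + dt * nu * Laplacian(u): the PDE-preserving
    branch; a trainable network would add a learned correction here."""
    return u + dt * nu * conv2d_periodic(u, LAPLACIAN_STENCIL) / dx**2
```

In the multi-resolution setting described in the abstract, such a fixed-stencil branch would be applied on a coarsened grid alongside a trainable convolutional branch, so the learned part only corrects the discretization rather than rediscovering the physics.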