Paper Title


JAX-DIPS: Neural bootstrapping of finite discretization methods and application to elliptic problems with discontinuities

Authors

Pouria Mistani, Samira Pakravan, Rajesh Ilango, Frederic Gibou

Abstract


We present a scalable strategy for the development of mesh-free hybrid neuro-symbolic partial differential equation solvers based on existing mesh-based numerical discretization methods. In particular, this strategy can be used to efficiently train neural network surrogate models of partial differential equations by (i) leveraging the accuracy and convergence properties of advanced numerical methods, solvers, and preconditioners, and (ii) achieving better scalability to higher-order PDEs by strictly limiting optimization to first-order automatic differentiation. The presented neural bootstrapping method (hereby dubbed NBM) is based on evaluating the finite discretization residuals of the PDE system, obtained on implicit Cartesian cells centered on a set of random collocation points, with respect to the trainable parameters of the neural network. Importantly, the conservation laws and symmetries present in the bootstrapped finite discretization equations inform the neural network about solution regularities within local neighborhoods of the training points. We apply NBM to the important class of elliptic problems with jump conditions across irregular interfaces in three spatial dimensions. We show the method is convergent: model accuracy improves by increasing the number of collocation points in the domain and by preconditioning the residuals. We show NBM is competitive in terms of memory and training speed with other PINN-type frameworks. The algorithms presented here are implemented using \texttt{JAX} in a software package named \texttt{JAX-DIPS} (https://github.com/JAX-DIPS/JAX-DIPS), standing for differentiable interfacial PDE solver. We open-source \texttt{JAX-DIPS} to facilitate research into the use of differentiable algorithms for developing hybrid PDE solvers.
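The core idea in the abstract, training a neural surrogate against the finite-discretization residual on a small stencil around each collocation point so that only first-order autodiff is needed, can be sketched in JAX. This is a minimal illustration for a 1D Poisson problem u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0, not the JAX-DIPS API: all names (`mlp`, `residual`, `step`) and the toy problem are hypothetical, whereas the actual package targets 3D elliptic problems with interfacial jump conditions.

```python
import jax
import jax.numpy as jnp

def mlp(params, x):
    # Tiny fully connected surrogate u_theta(x); x has shape (N, 1).
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return (x @ w + b).squeeze(-1)

def init_params(key, sizes=(1, 16, 16, 1)):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def f(x):
    # Manufactured source term so the exact solution is u(x) = sin(pi x).
    return -(jnp.pi ** 2) * jnp.sin(jnp.pi * x)

def residual(params, xc, h=0.05):
    # Finite-discretization residual of u'' - f on an implicit cell of
    # width h centered at each collocation point: the second derivative
    # comes from the stencil, not from differentiating the network twice,
    # so only first-order autodiff is ever required.
    u = lambda x: mlp(params, x.reshape(-1, 1))
    lap = (u(xc + h) - 2.0 * u(xc) + u(xc - h)) / h ** 2
    return lap - f(xc)

def loss(params, xc):
    u_bc = mlp(params, jnp.array([[0.0], [1.0]]))  # Dirichlet boundaries
    return jnp.mean(residual(params, xc) ** 2) + jnp.mean(u_bc ** 2)

@jax.jit
def step(params, xc, lr=1e-3):
    grads = jax.grad(loss)(params, xc)  # first-order AD only
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key, sub = jax.random.split(jax.random.PRNGKey(0))
params = init_params(key)
xc = jax.random.uniform(sub, (256,))  # random collocation points
for _ in range(500):
    params = step(params, xc)
```

The stencil evaluation is what the abstract calls "bootstrapping": the discretization scheme supplies the differential operator, so a higher-order PDE never forces higher-order automatic differentiation through the network.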
