Paper Title
A Deterministic Approximation to Neural SDEs
Paper Authors
Paper Abstract
Neural Stochastic Differential Equations (NSDEs) model the drift and diffusion functions of a stochastic process as neural networks. While NSDEs are known to make accurate predictions, their uncertainty quantification properties have so far remained unexplored. We report the empirical finding that obtaining well-calibrated uncertainty estimates from NSDEs is computationally prohibitive. As a remedy, we develop a computationally affordable deterministic scheme that accurately approximates the transition kernel when the dynamics are governed by an NSDE. Our method introduces a bidimensional moment matching algorithm: vertical along the neural network layers and horizontal along the time direction, which benefits from an original combination of effective approximations. Our deterministic approximation of the transition kernel is applicable to both training and prediction. We observe in multiple experiments that the uncertainty calibration quality of our method can be matched by Monte Carlo sampling only at high computational cost. Thanks to the numerical stability of deterministic training, our method also improves prediction accuracy.
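To make the idea of a deterministic approximation to the transition kernel concrete, below is a minimal sketch of the horizontal (time-direction) part of moment matching: propagating a Gaussian mean and covariance through Euler steps of an SDE dx = f(x) dt + L(x) dW using a first-order linearization of the drift. The toy `drift` and `diffusion` functions stand in for the paper's neural networks, and the layer-wise (vertical) moment matching through those networks is not reproduced here; this is an illustrative sketch under those assumptions, not the authors' exact algorithm.

```python
import numpy as np


def drift(x):
    # Stand-in for the drift network (assumption): a simple element-wise nonlinearity.
    return np.tanh(x)


def drift_jacobian(x):
    # Jacobian of the toy drift at x (diagonal because tanh acts element-wise).
    return np.diag(1.0 - np.tanh(x) ** 2)


def diffusion(x):
    # Stand-in for the diffusion network (assumption): constant isotropic noise.
    return 0.1 * np.eye(x.shape[0])


def moment_match_step(mean, cov, dt):
    """One horizontal (time-direction) moment-matching step.

    Propagates the Gaussian approximation N(mean, cov) of the state through an
    Euler step of dx = f(x) dt + L(x) dW, linearizing the drift at the mean.
    """
    F = drift_jacobian(mean)
    L = diffusion(mean)
    new_mean = mean + drift(mean) * dt
    new_cov = cov + (F @ cov + cov @ F.T + L @ L.T) * dt
    return new_mean, new_cov


def transition_kernel(mean0, cov0, t_end, dt=0.01):
    """Deterministic Gaussian approximation of the transition kernel p(x_T | x_0)."""
    mean, cov = mean0, cov0
    for _ in range(int(t_end / dt)):
        mean, cov = moment_match_step(mean, cov, dt)
    return mean, cov


if __name__ == "__main__":
    # Start from a nearly deterministic initial state and integrate to T = 1.
    m, P = transition_kernel(np.zeros(2), 1e-4 * np.eye(2), t_end=1.0)
    print("mean:", m)
    print("cov:", P)
```

Because every step is deterministic, the resulting Gaussian mean and covariance can be reused inside a likelihood for training as well as for prediction, in contrast to Monte Carlo sampling of many SDE trajectories.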