Paper Title

Easy Uncertainty Quantification (EasyUQ): Generating Predictive Distributions from Single-valued Model Output

Paper Authors

Eva-Maria Walz, Alexander Henzi, Johanna Ziegel, Tilmann Gneiting

Paper Abstract

How can we quantify uncertainty if our favorite computational tool - be it a numerical, a statistical, or a machine learning approach, or just any computer model - provides single-valued output only? In this article, we introduce the Easy Uncertainty Quantification (EasyUQ) technique, which transforms real-valued model output into calibrated statistical distributions, based solely on training data of model output-outcome pairs, without any need to access model input. In its basic form, EasyUQ is a special case of the recently introduced Isotonic Distributional Regression (IDR) technique that leverages the pool-adjacent-violators algorithm for nonparametric isotonic regression. EasyUQ yields discrete predictive distributions that are calibrated and optimal in finite samples, subject to stochastic monotonicity. The workflow is fully automated, without any need for tuning. The Smooth EasyUQ approach supplements IDR with kernel smoothing, to yield continuous predictive distributions that preserve key properties of the basic form, including both, stochastic monotonicity with respect to the original model output, and asymptotic consistency. For the selection of kernel parameters, we introduce multiple one-fit grid search, a computationally much less demanding approximation to leave-one-out cross-validation. We use simulation examples and forecast data from weather prediction to illustrate the techniques. In a study of benchmark problems from machine learning, we show how EasyUQ and Smooth EasyUQ can be integrated into the workflow of neural network learning and hyperparameter tuning, and find EasyUQ to be competitive with conformal prediction, as well as more elaborate input-based approaches.
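To make the basic idea concrete, the sketch below illustrates how EasyUQ-style predictive CDFs can be obtained from training pairs of single-valued model output and observed outcomes. It is not the authors' implementation: the function name easyuq_cdf is hypothetical, and scikit-learn's IsotonicRegression is used as a stand-in for the pool-adjacent-violators fit underlying IDR, applied threshold by threshold.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression


def easyuq_cdf(x_train, y_train, x_test, thresholds=None):
    """Minimal EasyUQ-style sketch (hypothetical helper, not the paper's code).

    For each threshold z, estimate P(Y <= z | model output = x) by antitonic
    regression of the indicators 1{y_i <= z} on the model output x_i, so that
    larger model output corresponds to stochastically larger outcomes.
    """
    x_train = np.asarray(x_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    x_test = np.asarray(x_test, dtype=float)
    if thresholds is None:
        # Jump points of the discrete predictive CDFs: the observed outcomes.
        thresholds = np.unique(y_train)

    cdf = np.empty((len(x_test), len(thresholds)))
    for j, z in enumerate(thresholds):
        # Antitonic (decreasing) regression of the indicator on model output.
        iso = IsotonicRegression(increasing=False, out_of_bounds="clip")
        iso.fit(x_train, (y_train <= z).astype(float))
        cdf[:, j] = iso.predict(x_test)

    # Threshold-wise fits are ordered in theory; accumulate guards against
    # floating-point violations so each row is a valid CDF.
    cdf = np.maximum.accumulate(cdf, axis=1)
    return thresholds, np.clip(cdf, 0.0, 1.0)


# Hypothetical usage: columns of F give P(Y <= z | new model output).
# z, F = easyuq_cdf(train_model_output, train_outcomes, new_model_output)
```

Smooth EasyUQ, as described in the abstract, would additionally apply kernel smoothing to these discrete predictive distributions to obtain continuous densities; the multiple one-fit grid search for choosing the kernel parameters is not reproduced in this sketch.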
