Paper Title

Test-time Recalibration of Conformal Predictors Under Distribution Shift Based on Unlabeled Examples

Paper Authors

Fatih Furkan Yilmaz, Reinhard Heckel

Paper Abstract

Modern image classifiers are very accurate, but the predictions come without uncertainty estimates. Conformal predictors provide uncertainty estimates by computing a set of classes containing the correct class with a user-specified probability based on the classifier's probability estimates. To provide such sets, conformal predictors often estimate a cutoff threshold for the probability estimates based on a calibration set. Conformal predictors guarantee reliability only when the calibration set is from the same distribution as the test set. Therefore, conformal predictors need to be recalibrated for new distributions. However, in practice, labeled data from new distributions is rarely available, making calibration infeasible. In this work, we consider the problem of predicting the cutoff threshold for a new distribution based on unlabeled examples. While it is impossible in general to guarantee reliability when calibrating based on unlabeled examples, we propose a method that provides excellent uncertainty estimates under natural distribution shifts, and provably works for a specific model of a distribution shift.
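The calibration step described in the abstract can be illustrated with standard split conformal prediction: estimate a cutoff threshold from nonconformity scores on a labeled calibration set, then form prediction sets that include every class whose score falls below the threshold. This is a minimal sketch of that generic procedure, not the recalibration method proposed in the paper; the function names and the synthetic toy data are illustrative assumptions.

```python
import numpy as np

def calibrate_threshold(cal_probs, cal_labels, alpha=0.1):
    """Estimate the cutoff threshold on a labeled calibration set.

    Nonconformity score: 1 - probability the classifier assigns
    to the true class.
    """
    scores = 1.0 - cal_probs[np.arange(len(cal_labels)), cal_labels]
    n = len(scores)
    # Finite-sample-corrected quantile level for 1 - alpha coverage.
    q = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(scores, min(q, 1.0), method="higher")

def prediction_set(probs, tau):
    """All classes whose nonconformity score is at most the threshold."""
    return np.where(1.0 - probs <= tau)[0]

# Toy example (illustrative): 3-class problem with synthetic probabilities.
rng = np.random.default_rng(0)
cal_probs = rng.dirichlet(np.ones(3), size=200)
cal_labels = cal_probs.argmax(axis=1)  # toy labels from a "correct" classifier
tau = calibrate_threshold(cal_probs, cal_labels, alpha=0.1)
sets = prediction_set(np.array([0.7, 0.2, 0.1]), tau)
```

The coverage guarantee of this threshold holds only when calibration and test examples come from the same distribution, which is exactly why, under distribution shift, the paper asks how to predict a new threshold from unlabeled test examples.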
