Paper Title

SHAP-XRT: The Shapley Value Meets Conditional Independence Testing

Paper Authors

Jacopo Teneggi, Beepul Bharti, Yaniv Romano, Jeremias Sulam

Abstract

The complex nature of artificial neural networks raises concerns on their reliability, trustworthiness, and fairness in real-world scenarios. The Shapley value -- a solution concept from game theory -- is one of the most popular explanation methods for machine learning models. More traditionally, from a statistical perspective, feature importance is defined in terms of conditional independence. So far, these two approaches to interpretability and feature importance have been considered separate and distinct. In this work, we show that Shapley-based explanation methods and conditional independence testing are closely related. We introduce the SHAPley EXplanation Randomization Test (SHAP-XRT), a testing procedure inspired by the Conditional Randomization Test (CRT) for a specific notion of local (i.e., on a sample) conditional independence. With it, we prove that for binary classification problems, the marginal contributions in the Shapley value provide lower and upper bounds to the expected $p$-values of their respective tests. Furthermore, we show that the Shapley value itself provides an upper bound to the expected $p$-value of a global (i.e., overall) null hypothesis. As a result, we further our understanding of Shapley-based explanation methods from a novel perspective and characterize the conditions under which one can make statistically valid claims about feature importance via the Shapley value.
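For context, the Shapley value of feature $j$ in a $d$-player game $v$ is $\phi_j(v)=\sum_{S\subseteq[d]\setminus\{j\}}\frac{|S|!\,(d-|S|-1)!}{d!}\bigl[v(S\cup\{j\})-v(S)\bigr]$, a weighted average of the marginal contributions $v(S\cup\{j\})-v(S)$ that the paper relates to the expected $p$-values of local conditional-independence tests. The snippet below is a minimal, hypothetical sketch of one such CRT-style Monte Carlo test at a single sample; the helper sample_conditional, the function name, and the choice of test statistic are illustrative assumptions, not the authors' implementation.

import numpy as np

def crt_style_p_value(model, x, j, fixed_idx, sample_conditional,
                      n_resamples=100, rng=None):
    # Hypothetical sketch of a CRT-style Monte Carlo test for feature j at a
    # single sample x, in the spirit of SHAP-XRT. sample_conditional(x, idx, rng)
    # is an assumed helper returning a copy of x whose coordinates in idx are
    # redrawn from their conditional distribution given the remaining ones.
    rng = np.random.default_rng(rng)
    d = x.shape[0]
    blocked = set(fixed_idx) | {j}
    free_idx = [k for k in range(d) if k not in blocked]

    # Observed statistic: keep feature j at its observed value and resample
    # the features outside fixed_idx and {j}.
    t_obs = model(sample_conditional(x, free_idx, rng))

    # Null statistics: resample feature j together with the free features, so
    # under the local null its observed value carries no extra information.
    exceed = sum(
        model(sample_conditional(x, free_idx + [j], rng)) >= t_obs
        for _ in range(n_resamples)
    )

    # Standard finite-sample Monte Carlo p-value.
    return (1 + exceed) / (1 + n_resamples)

Informally, averaging such per-subset tests with the Shapley weights is how the paper ties the Shapley value to statistically valid claims about feature importance.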
