Paper Title

Revisiting Chernoff Information with Likelihood Ratio Exponential Families

Paper Authors

Nielsen, Frank

Paper Abstract

The Chernoff information between two probability measures is a statistical divergence measuring their deviation defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the divergence found many other applications due to its empirical robustness property found in applications ranging from information fusion to quantum information. From the viewpoint of information theory, the Chernoff information can also be interpreted as a minmax symmetrization of the Kullback--Leibler divergence. In this paper, we first revisit the Chernoff information between two densities of a measurable Lebesgue space by considering the exponential families induced by their geometric mixtures: The so-called likelihood ratio exponential families. Second, we show how to (i) solve exactly the Chernoff information between any two univariate Gaussian distributions or get a closed-form formula using symbolic computing, (ii) report a closed-form formula of the Chernoff information of centered Gaussians with scaled covariance matrices and (iii) use a fast numerical scheme to approximate the Chernoff information between any two multivariate Gaussian distributions.
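As a concrete illustration of the variational definition in the abstract, the sketch below computes the Chernoff information C(p, q) = max_{α ∈ (0,1)} B_α(p : q) between two univariate Gaussians, where B_α(p : q) = −log ∫ p^α q^{1−α} dμ is the α-skewed Bhattacharyya distance. For an exponential family, B_α reduces to the Jensen gap of the log-normalizer F evaluated in natural coordinates, which is what the code exploits. This is a minimal numerical sketch of the general definition, not the paper's exact-solution or symbolic-computing method; the function names (`chernoff_information`, `skew_bhattacharyya`) are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def log_normalizer(theta):
    """Log-normalizer F of the univariate Gaussian exponential family.

    Natural coordinates: theta = (mu / sigma^2, -1 / (2 sigma^2)).
    """
    t1, t2 = theta
    return -t1**2 / (4.0 * t2) + 0.5 * np.log(-np.pi / t2)

def natural_params(mu, sigma2):
    """Map a Gaussian N(mu, sigma2) to its natural parameters."""
    return np.array([mu / sigma2, -1.0 / (2.0 * sigma2)])

def skew_bhattacharyya(alpha, theta_p, theta_q):
    """alpha-skewed Bhattacharyya distance B_alpha(p:q) = -log int p^alpha q^(1-alpha).

    For an exponential family this is the Jensen gap of F:
    alpha F(theta_p) + (1 - alpha) F(theta_q) - F(alpha theta_p + (1 - alpha) theta_q).
    """
    return (alpha * log_normalizer(theta_p)
            + (1.0 - alpha) * log_normalizer(theta_q)
            - log_normalizer(alpha * theta_p + (1.0 - alpha) * theta_q))

def chernoff_information(mu1, sigma1, mu2, sigma2):
    """Chernoff information C(p,q) = max over alpha in (0,1) of B_alpha(p:q).

    B_alpha is concave in alpha, so a bounded 1D maximization suffices.
    Returns (Chernoff information, optimal skew alpha*).
    """
    theta_p = natural_params(mu1, sigma1**2)
    theta_q = natural_params(mu2, sigma2**2)
    res = minimize_scalar(lambda a: -skew_bhattacharyya(a, theta_p, theta_q),
                          bounds=(1e-9, 1.0 - 1e-9), method="bounded")
    return -res.fun, res.x

# Sanity check: for N(0,1) vs N(1,1), symmetry gives alpha* = 1/2 and
# C = (mu1 - mu2)^2 / 8 = 0.125.
if __name__ == "__main__":
    c, alpha_star = chernoff_information(0.0, 1.0, 1.0, 1.0)
    print(f"C = {c:.6f}, alpha* = {alpha_star:.6f}")  # C ≈ 0.125, alpha* ≈ 0.5
```

At the optimal skew α*, the geometric mixture p_{α*} ∝ p^{α*} q^{1−α*} is at equal Kullback--Leibler divergence from p and q; this equidistance characterization of the Chernoff point is what underlies the fast numerical scheme mentioned in the abstract, and the generic 1D maximization above could equally be replaced by a bisection search on that condition.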
