Paper Title

Stochastic Stein Discrepancies

Paper Authors

Jackson Gorham, Anant Raj, Lester Mackey

Abstract

Stein discrepancies (SDs) monitor convergence and non-convergence in approximate inference when exact integration and sampling are intractable. However, the computation of a Stein discrepancy can be prohibitive if the Stein operator - often a sum over likelihood terms or potentials - is expensive to evaluate. To address this deficiency, we show that stochastic Stein discrepancies (SSDs) based on subsampled approximations of the Stein operator inherit the convergence control properties of standard SDs with probability 1. Along the way, we establish the convergence of Stein variational gradient descent (SVGD) on unbounded domains, resolving an open question of Liu (2017). In our experiments with biased Markov chain Monte Carlo (MCMC) hyperparameter tuning, approximate MCMC sampler selection, and stochastic SVGD, SSDs deliver comparable inferences to standard SDs with orders of magnitude fewer likelihood evaluations.
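To illustrate the idea of a subsampled Stein operator, here is a minimal sketch of stochastic SVGD on a toy 1-D target. All names and parameters (`L`, `B`, `stochastic_score`, the fixed RBF bandwidth) are illustrative assumptions, not the paper's implementation: the score of a standard normal, `-x`, is artificially split into `L` equal "likelihood terms" so that a minibatch of `B` terms, rescaled by `L / B`, gives an unbiased score estimate, as in the subsampled approximations the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: standard normal, with its score grad log p(x) = -x split
# into L "likelihood terms" (each contributing -x / L) to mimic a large
# dataset. L and B are hypothetical choices for this sketch.
L = 100   # number of potential terms
B = 10    # minibatch size for the subsampled Stein operator

def stochastic_score(x):
    """Unbiased subsampled estimate of grad log p(x) = -x."""
    idx = rng.choice(L, size=B, replace=False)
    # Each sampled term contributes -x / L; rescaling by L / B makes
    # the minibatch sum an unbiased estimate of the full score.
    return (L / B) * sum(-x / L for _ in idx)

def svgd_step(particles, step=0.1, h=1.0):
    """One stochastic SVGD update with an RBF kernel of bandwidth h."""
    diffs = particles[:, None] - particles[None, :]   # x_k - x_j
    K = np.exp(-diffs**2 / h)                         # k(x_k, x_j)
    gradK = -2.0 * diffs / h * K                      # d/dx_k k(x_k, x_j)
    scores = stochastic_score(particles)              # subsampled scores
    # phi(x_j) = mean_k [ k(x_k, x_j) s(x_k) + grad_{x_k} k(x_k, x_j) ]
    phi = (K * scores[:, None] + gradK).mean(axis=0)
    return particles + step * phi

particles = rng.normal(loc=5.0, scale=0.5, size=50)   # start far from target
for _ in range(500):
    particles = svgd_step(particles)
```

The driving (attractive) term pulls particles toward high-density regions via the estimated score, while the kernel-gradient term repels particles from one another; after the loop the particle cloud should sit near the target mean 0 rather than its starting mean 5.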
