Title

Importance Weighting Approach in Kernel Bayes' Rule

Authors

Liyuan Xu, Yutian Chen, Arnaud Doucet, Arthur Gretton

Abstract

We study a nonparametric approach to Bayesian computation via feature means, where the expectation of prior features is updated to yield expected kernel posterior features, based on regression from learned neural net or kernel features of the observations. All quantities involved in the Bayesian update are learned from observed data, making the method entirely model-free. The resulting algorithm is a novel instance of a kernel Bayes' rule (KBR), based on importance weighting. This results in superior numerical stability to the original approach to KBR, which requires operator inversion. We show the convergence of the estimator using a novel consistency analysis on the importance weighting estimator in the infinity norm. We evaluate KBR on challenging synthetic benchmarks, including a filtering problem with a state-space model involving high dimensional image observations. Importance weighted KBR yields uniformly better empirical performance than the original KBR, and competitive performance with other competing methods.
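The core idea described above — representing a posterior by an importance-weighted sum of kernel features of prior samples, rather than by operator inversion — can be illustrated with a minimal sketch. Note the assumptions: the paper learns all quantities (including the likelihood-based weights) from data in a model-free way, whereas this toy uses a known Gaussian prior and likelihood purely for illustration; the RBF feature map and all variable names (`prior_samples`, `y_obs`, `mu_post`) are illustrative choices, not the authors' implementation.

```python
import numpy as np

def rbf_features(x, centers, gamma=1.0):
    # Evaluate RBF kernel features k(x_i, c_j) for 1-D inputs:
    # returns a (len(x), len(centers)) matrix.
    d2 = (x[:, None] - centers[None, :]) ** 2
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)

# Prior samples x_i ~ p(x); here a standard normal prior (illustrative).
prior_samples = rng.normal(0.0, 1.0, size=500)

# Observation model y | x ~ N(x, 0.5^2) -- assumed known here only for
# illustration; in the paper these quantities are learned from data.
y_obs = 1.2
sigma = 0.5
likelihood = np.exp(-0.5 * ((y_obs - prior_samples) / sigma) ** 2)

# Self-normalized importance weights w_i ∝ p(y_obs | x_i).
weights = likelihood / likelihood.sum()

# Importance-weighted estimate of the posterior kernel mean embedding:
# mu_post ≈ sum_i w_i * phi(x_i), evaluated at a grid of kernel centers.
centers = np.linspace(-3.0, 3.0, 7)
mu_post = weights @ rbf_features(prior_samples, centers)

# Sanity check: the weighted sample mean should move from the prior
# mean (0) toward the exact Gaussian posterior mean y * 1/(1 + sigma^2).
post_mean = weights @ prior_samples
```

Because the estimate is a weighted average of prior features, it stays numerically stable regardless of how ill-conditioned the kernel Gram matrix is — in contrast to the original KBR, which must invert a regularized covariance operator.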
