Paper Title


Unbiased Multilevel Monte Carlo methods for intractable distributions: MLMC meets MCMC

Authors

Guanyang Wang, Tianze Wang

Abstract

Constructing unbiased estimators from Markov chain Monte Carlo (MCMC) outputs is a difficult problem that has recently received a lot of attention in the statistics and machine learning communities. However, the current unbiased MCMC framework only works when the quantity of interest is an expectation, which excludes many practical applications. In this paper, we propose a general method for constructing unbiased estimators for functions of expectations and extend it to construct unbiased estimators for nested expectations. Our approach combines and generalizes the unbiased MCMC and Multilevel Monte Carlo (MLMC) methods. In contrast to traditional sequential methods, our estimator can be implemented on parallel processors. We show that our estimator has a finite variance and computational complexity and can achieve $\varepsilon$-accuracy within the optimal $O(1/\varepsilon^2)$ computational cost under mild conditions. Our numerical experiments confirm our theoretical findings and demonstrate the benefits of unbiased estimators in the massively parallel regime.
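The abstract describes combining unbiased MCMC with randomized Multilevel Monte Carlo to estimate functions of expectations, f(E[X]), without bias. As a minimal sketch of the MLMC half of that idea only, the code below implements a standard single-term randomized multilevel estimator, with i.i.d. sampling standing in for the paper's MCMC chains; the function name, the geometric level distribution, and the parameters `r` and `n0` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def unbiased_f_of_mean(sample, f, rng, r=1.5, n0=2):
    """Single-term randomized MLMC estimator of f(E[X]) (illustrative sketch).

    Draws a random level L with P(L = l) = (1 - q) * q**l, q = 2**(-r),
    forms the antithetic level-L difference, and reweights by 1 / P(L = l).
    Averaging independent replications estimates f(E[X]) without bias.
    """
    q = 2.0 ** (-r)
    u = rng.random()
    L, cdf = 0, 1.0 - q
    while u > cdf:
        L += 1
        cdf += (1.0 - q) * q ** L
    p_L = (1.0 - q) * q ** L

    if L == 0:
        # coarsest level: plug-in estimate from n0 samples
        delta = f(sample(n0, rng).mean())
    else:
        xs = sample(n0 * 2 ** L, rng)
        half = len(xs) // 2
        # fine estimate minus the average of two half-sample coarse estimates;
        # the telescoping sum over levels recovers f(E[X]) in expectation
        delta = f(xs.mean()) - 0.5 * (f(xs[:half].mean()) + f(xs[half:].mean()))
    return delta / p_L

# Usage sketch: estimate (E[X])**2 for X ~ N(1, 1); the true value is 1.
rng = np.random.default_rng(0)
draws = [unbiased_f_of_mean(lambda n, g: g.normal(1.0, 1.0, n),
                            lambda m: m * m, rng)
         for _ in range(100_000)]
estimate = float(np.mean(draws))
```

Because each replication is independent, the draws can be generated on parallel processors and averaged afterwards, which is the regime the abstract highlights; choosing the level-distribution decay rate `r` between the variance- and cost-decay rates keeps both the variance and the expected cost finite.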
