Paper Title

Convergence Rates of Empirical Bayes Posterior Distributions: A Variational Perspective

Authors

Fengshuo Zhang and Chao Gao

Abstract

We study the convergence rates of empirical Bayes posterior distributions for nonparametric and high-dimensional inference. We show that as long as the hyperparameter set is discrete, the empirical Bayes posterior distribution induced by the maximum marginal likelihood estimator can be regarded as a variational approximation to a hierarchical Bayes posterior distribution. This connection between empirical Bayes and variational Bayes allows us to leverage recent results in the variational Bayes literature and directly obtain convergence rates of empirical Bayes posterior distributions from a variational perspective. For a more general hyperparameter set that is not necessarily discrete, we introduce a new technique called "prior decomposition" to deal with prior distributions that can be written as convex combinations of probability measures whose supports are low-dimensional subspaces. This leads to generalized versions of the classical "prior mass and testing" conditions for the convergence rates of empirical Bayes posterior distributions. Our theory is applied to a number of statistical estimation problems including nonparametric density estimation and sparse linear regression.
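
To make the abstract's objects concrete, here is a schematic sketch in our own notation (not taken from the paper): given data $X^{(n)}$ from a model $\{P_\theta^{(n)}\}$, a family of priors $\{\Pi_\lambda : \lambda \in \Lambda\}$ indexed by a hyperparameter, and hyperprior weights $\gamma$ on a discrete set $\Lambda$, the empirical Bayes posterior is built from the maximum marginal likelihood estimator, while the hierarchical Bayes posterior uses the mixture prior; a variational approximation restricted to the family of single-hyperparameter posteriors then links the two.

\[
\hat{\lambda} = \operatorname*{arg\,max}_{\lambda \in \Lambda} m_\lambda\!\left(X^{(n)}\right),
\qquad
m_\lambda\!\left(X^{(n)}\right) = \int p_\theta\!\left(X^{(n)}\right) d\Pi_\lambda(\theta),
\]
\[
\text{empirical Bayes posterior:}\quad
d\Pi_{\hat{\lambda}}\!\left(\theta \mid X^{(n)}\right)
= \frac{p_\theta\!\left(X^{(n)}\right) d\Pi_{\hat{\lambda}}(\theta)}{m_{\hat{\lambda}}\!\left(X^{(n)}\right)},
\]
\[
\text{hierarchical Bayes posterior:}\quad
d\Pi\!\left(\theta \mid X^{(n)}\right) \propto p_\theta\!\left(X^{(n)}\right) d\Pi(\theta),
\qquad
\Pi = \sum_{\lambda \in \Lambda} \gamma(\lambda)\, \Pi_\lambda,
\]
\[
\text{variational approximation:}\quad
\hat{Q} = \operatorname*{arg\,min}_{Q \in \mathcal{Q}}
D_{\mathrm{KL}}\!\left(Q \,\Big\|\, \Pi\!\left(\cdot \mid X^{(n)}\right)\right),
\qquad
\mathcal{Q} = \left\{\Pi_\lambda\!\left(\cdot \mid X^{(n)}\right) : \lambda \in \Lambda\right\}.
\]

On this reading, the abstract's first result says that for discrete $\Lambda$ the empirical Bayes posterior $\Pi_{\hat{\lambda}}(\cdot \mid X^{(n)})$ can be analyzed as such a KL-minimizing member $\hat{Q}$; the exact variational family and any weighting by $\gamma$ are as specified in the paper, not in this sketch. Likewise, "prior decomposition" refers to writing a prior as a convex combination $\Pi = \sum_j w_j \Pi_j$ in which each component $\Pi_j$ is supported on a low-dimensional subspace.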
