Title

Sharper convergence bounds of Monte Carlo Rademacher Averages through Self-Bounding functions

Authors

Pellegrina, Leonardo

Abstract

We derive sharper probabilistic concentration bounds for the Monte Carlo Empirical Rademacher Averages (MCERA), which are proved through recent results on the concentration of self-bounding functions. Our novel bounds are characterized by convergence rates that depend on data-dependent characteristic quantities of the set of functions under consideration, such as the empirical wimpy variance, an essential improvement w.r.t. standard bounds based on the method of bounded differences. For this reason, our new results are applicable to yield sharper bounds on (Local) Rademacher Averages. We also derive improved novel variance-dependent bounds for the special case in which only one vector of Rademacher random variables is used to compute the MCERA, through the application of Bousquet's inequality and novel data-dependent bounds on the wimpy variance. Then, we leverage the framework of self-bounding functions to derive novel probabilistic bounds on the supremum deviations, which may be of independent interest.
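As a rough illustration of the quantity the paper studies, the n-MCERA averages, over m independent vectors of Rademacher signs, the supremum over the function family of the signed empirical mean. The following is a minimal sketch, not the paper's implementation; the toy representation of the family as a matrix of precomputed values `values[f][i] = f(x_i)` is an assumption for illustration.

```python
import random

def mcera(values, m=10, seed=0):
    """Sketch of the n-MCERA: average over m Rademacher sign vectors of
    the supremum, over the function family, of the signed empirical mean.

    values: k rows (one per function f in the family), each of length n,
            with values[f][i] = f(x_i) on the n sample points
            (hypothetical toy setup, assumed precomputed).
    m: number of Monte Carlo trials (Rademacher vectors).
    """
    rnd = random.Random(seed)
    n = len(values[0])
    total = 0.0
    for _ in range(m):
        # draw one vector of i.i.d. Rademacher signs in {-1, +1}
        sigma = [rnd.choice((-1.0, 1.0)) for _ in range(n)]
        # supremum over the family of the signed empirical mean
        total += max(sum(s * v for s, v in zip(sigma, row)) / n
                     for row in values)
    return total / m
```

For a family of [0, 1]-bounded functions the estimate stays in [-1, 1]; the paper's contribution is that the deviation of this estimator from the (Empirical) Rademacher Average concentrates at a rate governed by data-dependent quantities such as the empirical wimpy variance, rather than the worst-case range used by the bounded-differences method.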
