Paper Title

Learning with Semi-Definite Programming: new statistical bounds based on fixed point analysis and excess risk curvature

Authors

Stéphane Chrétien, Mihai Cucuringu, Guillaume Lecué, Lucie Neirac

Abstract

Many statistical learning problems have recently been shown to be amenable to Semi-Definite Programming (SDP), with community detection and clustering in Gaussian mixture models as the most striking instances [Javanmard et al., 2016]. Given the growing range of applications of SDP-based techniques to machine learning problems, and the rapid progress in the design of efficient algorithms for solving SDPs, an intriguing question is to understand how the recent advances from empirical process theory can be put to work in order to provide a precise statistical analysis of SDP estimators. In the present paper, we borrow cutting-edge techniques and concepts from the learning theory literature, such as fixed point equations and excess risk curvature arguments, which yield general estimation and prediction results for a wide class of SDP estimators. From this perspective, we revisit some classical results in community detection from [Guédon et al., 2016] and [Chen et al., 2016], and we obtain statistical guarantees for SDP estimators used in signed clustering, group synchronization and MaxCut.
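As a concrete illustration of the kind of SDP estimator the abstract refers to, below is a minimal sketch of the classical MaxCut SDP relaxation (maximize (1/4)·⟨W, 1 − X⟩ over unit-diagonal positive semi-definite matrices X). The toy weight matrix W and the use of the cvxpy solver are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
import cvxpy as cp

# Minimal sketch of the MaxCut SDP relaxation mentioned in the abstract:
#   maximize   (1/4) * sum_{i,j} W_ij * (1 - X_ij)
#   subject to X PSD, X_ii = 1,
# where X relaxes the rank-one matrix x x^T with x in {-1, +1}^n.
# The weight matrix below is a toy example chosen for illustration only.
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 1.],
              [1., 1., 0., 1.],
              [0., 1., 1., 0.]])
n = W.shape[0]

X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0,            # positive semi-definiteness
               cp.diag(X) == 1]   # unit diagonal
objective = cp.Maximize(0.25 * cp.sum(cp.multiply(W, np.ones((n, n)) - X)))
prob = cp.Problem(objective, constraints)
prob.solve()

print("SDP relaxation value:", prob.value)
```

Solving this program requires an SDP-capable backend (the SCS solver shipped with cvxpy suffices); the optimal value upper-bounds the weight of the maximum cut of the toy graph.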
