Paper Title

Efficient Private SCO for Heavy-Tailed Data via Averaged Clipping

Paper Authors

Chenhan Jin, Kaiwen Zhou, Bo Han, James Cheng, Tieyong Zeng

Paper Abstract

We consider stochastic convex optimization for heavy-tailed data with the guarantee of being differentially private (DP). Most prior works on differentially private stochastic convex optimization for heavy-tailed data are either restricted to gradient descent (GD) or perform multiple rounds of clipping on stochastic gradient descent (SGD), which is inefficient for large-scale problems. In this paper, we consider a one-time clipping strategy and provide principled analyses of its bias and private mean estimation. We establish new convergence results and improved complexity bounds for the proposed algorithm, called AClipped-dpSGD, for both constrained and unconstrained convex problems. We also extend our convergence analysis to the strongly convex case and the non-smooth case (which covers generalized smooth objectives with Hölder-continuous gradients). All the above results hold with high probability for heavy-tailed data. Numerical experiments are conducted to justify the theoretical improvements.
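To make the contrast concrete, below is a minimal, hypothetical Python sketch of per-sample ("multi-time") clipping versus the one-time clipping idea the abstract describes. The function names, update rule, and noise scale are illustrative assumptions, not the paper's actual AClipped-dpSGD; in particular, a real implementation must calibrate the noise through the algorithm's own privacy accounting, which this sketch does not reproduce.

```python
import numpy as np

def per_sample_clipped_step(w, grads, C, sigma, lr, rng):
    # Baseline DP-SGD: clip each per-sample gradient to norm C
    # ("multi-time" clipping), average, then add Gaussian noise.
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    clipped = grads * np.minimum(1.0, C / np.maximum(norms, 1e-12))
    g = clipped.mean(axis=0) + rng.normal(0.0, sigma * C / len(grads), size=w.shape)
    return w - lr * g

def one_time_clipped_step(w, grads, C, sigma, lr, rng):
    # One-time clipping: average the mini-batch gradients first,
    # clip that average once, then add noise. The noise scale here
    # is schematic, not a verified privacy calibration.
    g_bar = grads.mean(axis=0)
    g_bar = g_bar * min(1.0, C / max(np.linalg.norm(g_bar), 1e-12))  # single clip
    g = g_bar + rng.normal(0.0, sigma * C / len(grads), size=w.shape)
    return w - lr * g

# Toy usage on synthetic heavy-tailed (Student-t) gradients.
rng = np.random.default_rng(0)
w = np.zeros(5)
grads = rng.standard_t(df=2.5, size=(64, 5))  # heavy-tailed noise
w = one_time_clipped_step(w, grads, C=1.0, sigma=1.0, lr=0.1, rng=rng)
```

Relative to the per-sample baseline, the averaged variant clips once per mini-batch rather than once per example, which reflects the efficiency argument the abstract makes for large-scale problems.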
