Paper Title

Privacy Amplification of Iterative Algorithms via Contraction Coefficients

Paper Authors

Shahab Asoodeh, Mario Diaz, Flavio P. Calmon

Paper Abstract

We investigate the framework of privacy amplification by iteration, recently proposed by Feldman et al., through an information-theoretic lens. We demonstrate that the differential privacy guarantees of iterative mappings can be determined by a direct application of contraction coefficients derived from strong data processing inequalities for $f$-divergences. In particular, by generalizing Dobrushin's contraction coefficient for total variation distance to an $f$-divergence known as the $E_\gamma$-divergence, we derive tighter bounds on the differential privacy parameters of the projected noisy stochastic gradient descent algorithm with hidden intermediate updates.
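
As background for the abstract, here is a sketch of the standard definitions involved; the notation below is conventional in the differential privacy literature and is not quoted from the paper itself. For $\gamma \ge 1$, the $E_\gamma$-divergence (also called the hockey-stick divergence) between distributions $P$ and $Q$ is

$$E_\gamma(P \,\|\, Q) = \sup_{A} \bigl[ P(A) - \gamma\, Q(A) \bigr],$$

which reduces to the total variation distance at $\gamma = 1$. Its relevance to differential privacy is that a mechanism $M$ satisfies $(\varepsilon, \delta)$-differential privacy if and only if, for all neighboring datasets $D$ and $D'$,

$$E_{e^{\varepsilon}}\bigl( P_{M(D)} \,\|\, P_{M(D')} \bigr) \le \delta.$$

The contraction coefficient of a Markov kernel $K$ with respect to $E_\gamma$,

$$\eta_\gamma(K) = \sup_{P \neq Q} \frac{E_\gamma(PK \,\|\, QK)}{E_\gamma(P \,\|\, Q)},$$

generalizes Dobrushin's coefficient (the $\gamma = 1$ case). Each noisy, projected update contracts the $E_\gamma$-divergence between the trajectories started from neighboring datasets by a factor of at most $\eta_\gamma$, which is the mechanism behind the tighter bounds described above.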
