Paper Title

Descent-to-Delete: Gradient-Based Methods for Machine Unlearning

Paper Authors

Seth Neel, Aaron Roth, Saeed Sharifi-Malvajerdi

Paper Abstract

We study the data deletion problem for convex models. By leveraging techniques from convex optimization and reservoir sampling, we give the first data deletion algorithms that are able to handle an arbitrarily long sequence of adversarial updates while promising both per-deletion run-time and steady-state error that do not grow with the length of the update sequence. We also introduce several new conceptual distinctions: for example, we can ask that after a deletion, the entire state maintained by the optimization algorithm is statistically indistinguishable from the state that would have resulted had we retrained, or we can ask for the weaker condition that only the observable output is statistically indistinguishable from the observable output that would have resulted from retraining. We are able to give more efficient deletion algorithms under this weaker deletion criterion.
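The deletion scheme the abstract describes can be illustrated with a minimal sketch: after a deletion request, warm-start gradient descent from the current parameters on the remaining data for a few steps, then release a noise-perturbed copy so the observable output is statistically close to what full retraining would have produced. This is an illustrative toy (ridge regression, hand-picked step count, learning rate, and noise scale), not the paper's exact algorithm or its calibrated noise levels.

```python
import numpy as np

def grad(theta, X, y, lam=0.1):
    # Gradient of L2-regularized least squares (a strongly convex model).
    return X.T @ (X @ theta - y) / len(y) + lam * theta

def descend(theta, X, y, steps=100, lr=0.1):
    # Plain gradient descent, optionally warm-started from a previous model.
    for _ in range(steps):
        theta = theta - lr * grad(theta, X, y)
    return theta

def publish(theta, sigma=0.01, seed=0):
    # Output perturbation: release a noisy copy of the parameters so the
    # released model is statistically hard to distinguish from retraining.
    rng = np.random.default_rng(seed)
    return theta + rng.normal(0.0, sigma, size=theta.shape)

# Synthetic training data (hypothetical): 100 points, 5 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.ones(5) + 0.1 * rng.normal(size=100)

theta = descend(np.zeros(5), X, y)                # initial training
X_del, y_del = X[1:], y[1:]                       # deletion request: drop point 0
theta = descend(theta, X_del, y_del, steps=20)    # brief fine-tune on the rest
released = publish(theta)                         # noisy release
```

Because the loss is strongly convex, removing one point moves the optimum only slightly, so a short warm-started descent (here 20 steps instead of 100) lands near the retrained solution; this is what keeps per-deletion run-time from growing with the update sequence.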
