Paper Title

Robust and Lossless Fingerprinting of Deep Neural Networks via Pooled Membership Inference

Paper Author

Wu, Hanzhou

Paper Abstract

Deep neural networks (DNNs) have achieved great success in many application areas and brought profound changes to our society. However, this success also raises new security problems, among which protecting the intellectual property (IP) of DNNs against infringement is one of the most important yet most challenging topics. To address this problem, recent studies focus on IP protection of DNNs via digital watermarking, which embeds source information and/or authentication data into a DNN model by tuning its network parameters directly or indirectly. However, tuning network parameters inevitably distorts the DNN and therefore impairs the model's performance on its original task, regardless of the degree of degradation. This motivated the authors to propose a novel technique called pooled membership inference (PMI) to protect the IP of DNN models. The proposed PMI neither alters the network parameters of the given DNN model nor fine-tunes it with a sequence of carefully crafted trigger samples. Instead, it leaves the original DNN model unchanged, yet can determine the ownership of the model by inferring which of multiple mini-datasets was once used to train the target model, which differs from previous art and has remarkable potential in practice. Experiments also demonstrate the superiority and applicability of this work.
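The abstract's core idea is that a trained model fits its own training data better than other candidate data, so the owner can identify which mini-dataset was used for training by comparing the model's pooled (dataset-level) loss across candidates. The following is only a minimal toy sketch of that idea under strong simplifying assumptions, not the paper's actual method: the "model" is just a sample mean, the mini-datasets are synthetic Gaussian samples, and the names are made up for illustration.

```python
import random
import statistics

random.seed(0)

# Hypothetical setup: three candidate mini-datasets drawn from different
# distributions; the "target model" is trained on exactly one of them.
mini_datasets = [
    [random.gauss(mu, 1.0) for _ in range(200)] for mu in (0.0, 5.0, 10.0)
]

# "Train" a toy model on mini-dataset 1: here the model is simply the
# sample mean of its training data (a stand-in for a fitted DNN).
model = statistics.mean(mini_datasets[1])

def pooled_loss(model, dataset):
    """Average squared error of the model over a whole mini-dataset."""
    return statistics.mean((x - model) ** 2 for x in dataset)

# Pooled membership inference: the candidate mini-dataset on which the
# model's pooled loss is smallest is inferred to be its training set.
losses = [pooled_loss(model, d) for d in mini_datasets]
inferred = min(range(len(losses)), key=lambda i: losses[i])
print(inferred)  # → 1
```

The pooling is what distinguishes this setting from classic per-sample membership inference: the decision is made over an entire candidate mini-dataset at once, which averages out per-sample noise.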
