Paper Title

MUD-PQFed: Towards Malicious User Detection in Privacy-Preserving Quantized Federated Learning

Paper Authors

Hua Ma, Qun Li, Yifeng Zheng, Zhi Zhang, Xiaoning Liu, Yansong Gao, Said F. Al-Sarawi, Derek Abbott

Paper Abstract

Federated Learning (FL), a distributed machine learning paradigm, has been adopted to mitigate clients' privacy concerns. Despite its appeal, various inference attacks can exploit the shared plaintext model updates, which embed traces of clients' private information, leading to serious privacy concerns. To alleviate this privacy issue, cryptographic techniques such as secure multi-party computation and homomorphic encryption have been used for privacy-preserving FL. However, security issues in privacy-preserving FL remain poorly elucidated and underexplored. This work is the first attempt to show how trivially model corruption attacks can be performed on privacy-preserving FL based on lightweight secret sharing. We consider the scenario in which model updates are quantized to reduce communication overhead; in this setting, an adversary can simply submit local parameters outside the legal range to corrupt the global model. We then propose the MUD-PQFed protocol, which precisely detects the malicious clients performing the attack and enforces fair penalties. By removing the contributions of the detected malicious clients, the global model utility remains comparable to that of the attack-free baseline. Extensive experiments validate the effectiveness of MUD-PQFed in maintaining baseline accuracy and detecting malicious clients in a fine-grained manner.
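
To make the attack surface described in the abstract concrete, below is a minimal NumPy sketch of how a single out-of-range quantized update can corrupt the aggregate under additive secret sharing. This is not the paper's actual protocol: the bit-width B, field modulus P, two-server split, and the helper functions quantize/share are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: additive secret sharing of B-bit quantized updates across
# two non-colluding servers, and a malicious client whose values fall outside
# the legal quantized range. All constants and helpers here are illustrative.

P = 2**31 - 1   # field modulus for the additive shares (assumed)
B = 8           # quantization bit-width; legal range is [0, 2**B - 1]

def quantize(w, w_min=-1.0, w_max=1.0, bits=B):
    """Uniformly quantize float weights into the legal integer range."""
    scale = (2**bits - 1) / (w_max - w_min)
    return np.round((np.clip(w, w_min, w_max) - w_min) * scale).astype(np.int64)

def share(x, rng):
    """Split an integer vector into two additive shares modulo P."""
    s0 = rng.integers(0, P, size=x.shape)
    s1 = (x - s0) % P
    return s0, s1

rng = np.random.default_rng(0)

# Three honest clients submit properly quantized updates.
honest = [quantize(rng.uniform(-1, 1, size=4)) for _ in range(3)]

# One malicious client submits values far outside [0, 2**B - 1].
malicious = np.full(4, 10**6, dtype=np.int64)

# Each server sums the shares it holds; adding the two partial sums mod P
# reconstructs the aggregate, silently including the corrupted contribution.
shares = [share(u, rng) for u in honest + [malicious]]
server0 = sum(s0 for s0, _ in shares) % P
server1 = sum(s1 for _, s1 in shares) % P
aggregate = (server0 + server1) % P

print("aggregate with malicious client:", aggregate)
print("aggregate of honest clients only:", sum(honest) % P)
```

Because each server only ever sees uniformly random-looking shares, the out-of-range contribution is indistinguishable from a legitimate one until the sum is reconstructed, which is the gap the abstract says MUD-PQFed's detection is designed to close.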
