Paper Title

Communication-Efficient and Drift-Robust Federated Learning via Elastic Net

Authors

Kim, Seonhyeong, Woo, Jiheon, Seo, Daewon, Kim, Yongjune

Abstract

Federated learning (FL) is a distributed method to train a global model over a set of local clients while keeping data localized. It reduces privacy and security risks but faces important challenges, including expensive communication costs and client drift. To address these issues, we propose FedElasticNet, a communication-efficient and drift-robust FL framework leveraging the elastic net. It repurposes the two elastic net regularizers (i.e., $\ell_1$ and $\ell_2$ penalties on the local model updates): (1) the $\ell_1$-norm regularizer sparsifies the local updates to reduce communication costs, and (2) the $\ell_2$-norm regularizer resolves the client drift problem by limiting the impact of drifting local updates caused by data heterogeneity. FedElasticNet is a general framework for FL; hence, without additional costs, it can be integrated into prior FL techniques, e.g., FedAvg, FedProx, SCAFFOLD, and FedDyn. We show that our framework effectively resolves the communication cost and client drift problems simultaneously.
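
To make the idea concrete, below is a minimal sketch (not the authors' implementation) of how the two elastic net penalties on the local update $w - w_{\text{global}}$ might enter a client's local training step in PyTorch. The function name `local_step` and the hyperparameters `lambda1` and `lambda2` are hypothetical names chosen here for illustration.

```python
# A minimal sketch of a FedElasticNet-style local step, assuming a PyTorch
# model. lambda1/lambda2 are hypothetical names for the l1/l2 penalty weights
# on the local update (w - w_global); this is not the authors' code.
import torch

def local_step(model, global_params, batch, loss_fn, optimizer,
               lambda1=1e-4, lambda2=1e-2):
    """One local SGD step with elastic net penalties on (w - w_global).

    global_params: detached copies of the server model's parameters,
    in the same order as model.parameters().
    """
    inputs, targets = batch
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)

    # Elastic net regularizer on the local update w - w_global:
    #   - the l1 term pushes the update toward sparsity (cheaper to send);
    #   - the l2 term keeps the local model near the global one (limits drift).
    for p, g in zip(model.parameters(), global_params):
        delta = p - g
        loss = loss + lambda1 * delta.abs().sum() \
                    + 0.5 * lambda2 * delta.pow(2).sum()

    loss.backward()
    optimizer.step()
```

In this reading, after local training each client transmits only the (now sparser) update $w - w_{\text{global}}$ to the server, which is where the $\ell_1$ term saves communication, while the $\ell_2$ term plays a role similar to the proximal term in FedProx.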
