Paper Title

Towards Flexible Device Participation in Federated Learning

Paper Authors

Yichen Ruan, Xiaoxi Zhang, Shu-Che Liang, Carlee Joe-Wong

Abstract

Traditional federated learning algorithms impose strict requirements on the participation rates of devices, which limit the potential reach of federated learning. This paper extends the current learning paradigm to include devices that may become inactive, compute incomplete updates, and depart or arrive in the middle of training. We derive analytical results to illustrate how allowing more flexible device participation can affect the learning convergence when data is not independently and identically distributed (non-IID). We then propose a new federated aggregation scheme that converges even when devices may be inactive or return incomplete updates. We also study how the learning process can adapt to early departures or late arrivals, and analyze their impacts on the convergence.
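The abstract does not specify the paper's actual aggregation rule, but the core difficulty it names — devices that are inactive in a round or return only partially computed updates — can be illustrated with a minimal sketch. The function below is a hypothetical, assumption-laden example of one plausible fix: weighting active devices by data size and rescaling a partial update by the fraction of local work completed, so partial contributions are not systematically underweighted. It is not the scheme proposed in the paper.

```python
# Illustrative sketch only: the abstract does not give the paper's aggregation
# scheme. This shows one plausible way to aggregate when some devices are
# inactive or return incomplete updates — rescale each partial update by the
# fraction of assigned local work the device actually completed.

def aggregate(global_model, updates, data_sizes):
    """Combine per-device updates into a new global model.

    global_model: list of floats (current global parameters).
    updates: dict mapping device id -> (delta, completed_fraction), where
        delta is the device's model change after completed_fraction of its
        assigned local epochs. Devices inactive this round are simply absent.
    data_sizes: dict mapping device id -> number of local samples.
    """
    # Weight only the devices that actually participated this round.
    total = sum(data_sizes[i] for i in updates)
    new_model = list(global_model)
    for i, (delta, frac) in updates.items():
        if frac == 0:
            continue  # device contributed nothing usable this round
        # Data-size weight, divided by the completed fraction so a half-done
        # update is scaled up to estimate the full-epoch update.
        scale = data_sizes[i] / total / frac
        for k in range(len(new_model)):
            new_model[k] += scale * delta[k]
    return new_model
```

For example, a device that finished only half its local epochs has its delta doubled before weighting, so in expectation it contributes as much as a device that finished all of them. Whether such rescaling preserves convergence under non-IID data is exactly the kind of question the paper's analysis addresses.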
