Paper Title

Closing the Gap between Client and Global Model Performance in Heterogeneous Federated Learning

Authors

Hongrui Shi, Valentin Radu, Po Yang

Abstract

The heterogeneity of hardware and data is a well-known and well-studied problem in the Federated Learning (FL) community when running under heterogeneous settings. Recently, custom-size client models trained with Knowledge Distillation (KD) have emerged as a viable strategy for tackling the heterogeneity challenge. However, previous efforts in this direction focus on tuning the client models rather than on their impact on the knowledge aggregation of the global model. Although the performance of the global model is the primary objective of FL systems, client models have received more attention under heterogeneous settings. Here, we provide more insight into how the approach chosen for training custom client models affects the global model, which is essential for any FL application. We show that the global model can fully leverage the strength of KD with heterogeneous data. Driven by empirical observations, we further propose a new approach that combines KD and Learning without Forgetting (LwoF) to produce improved personalised models. We bring heterogeneous FL on par with the mighty FedAvg of homogeneous FL in realistic deployment scenarios with dropping clients.
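
To make the KD idea in the abstract concrete, below is a minimal PyTorch sketch (not the authors' code) of the kind of distillation objective commonly used to train a custom-size client model against soft targets from a teacher (e.g., the global model). The function name `kd_loss` and the `temperature`/`alpha` hyperparameters are illustrative assumptions, not values from the paper.

```python
# Illustrative knowledge-distillation loss: soft-target KL term (Hinton et al.)
# blended with the usual hard-label cross-entropy on local client data.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Return alpha * distillation term + (1 - alpha) * supervised term."""
    # Teacher distribution and student log-distribution, softened by temperature.
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    # KL divergence between student and teacher, rescaled by T^2 as is customary.
    distill = F.kl_div(log_student, soft_targets, reduction="batchmean") * temperature ** 2
    # Standard cross-entropy on the client's local hard labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * hard
```

In a heterogeneous FL round, each client would minimise such a loss on its own data with its own custom-size architecture, while the server-side aggregation (e.g., FedAvg-style averaging for homogeneous parts, or distillation-based fusion) builds the global model; the exact aggregation used in the paper is described in its method section, not in this sketch.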
