Paper Title

Tackling Data Heterogeneity in Federated Learning with Class Prototypes

Authors

Yutong Dai, Zeyuan Chen, Junnan Li, Shelby Heinecke, Lichao Sun, Ran Xu

Abstract

Data heterogeneity across clients in federated learning (FL) settings is a widely acknowledged challenge. In response, personalized federated learning (PFL) emerged as a framework to curate local models for clients' tasks. In PFL, a common strategy is to develop local and global models jointly - the global model (for generalization) informs the local models, and the local models (for personalization) are aggregated to update the global model. A key observation is that if we can improve the generalization ability of local models, then we can improve the generalization of global models, which in turn builds better personalized models. In this work, we consider class imbalance, an overlooked type of data heterogeneity, in the classification setting. We propose FedNH, a novel method that improves the local models' performance for both personalization and generalization by combining the uniformity and semantics of class prototypes. FedNH initially distributes class prototypes uniformly in the latent space and smoothly infuses the class semantics into class prototypes. We show that imposing uniformity helps to combat prototype collapse while infusing class semantics improves local models. Extensive experiments were conducted on popular classification datasets under the cross-device setting. Our results demonstrate the effectiveness and stability of our method over recent works.
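The two ingredients the abstract describes, distributing class prototypes uniformly in the latent space and smoothly infusing class semantics into them, can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the function names, the squared-cosine-similarity uniformity objective, and the hyperparameters (`steps`, `lr`, `rho`) are assumptions made for the example.

```python
import numpy as np

def init_uniform_prototypes(num_classes, dim, steps=2000, lr=0.1, seed=0):
    """Spread class prototypes over the unit hypersphere by minimizing the
    sum of squared pairwise cosine similarities (a simple uniformity proxy;
    hyperparameters here are illustrative, not from the paper)."""
    rng = np.random.default_rng(seed)
    P = rng.normal(size=(num_classes, dim))
    P /= np.linalg.norm(P, axis=1, keepdims=True)
    for _ in range(steps):
        sim = P @ P.T                      # pairwise cosine similarities
        np.fill_diagonal(sim, 0.0)         # ignore self-similarity
        grad = sim @ P                     # descent direction for the proxy loss
        P -= lr * grad
        P /= np.linalg.norm(P, axis=1, keepdims=True)  # project back to sphere
    return P

def infuse_semantics(prototypes, class_means, rho=0.9):
    """Smoothly blend per-class mean embeddings into the prototypes via an
    exponential moving average, then re-normalize (rho is illustrative)."""
    P = rho * prototypes + (1.0 - rho) * class_means
    return P / np.linalg.norm(P, axis=1, keepdims=True)
```

With `num_classes` no larger than `dim`, the uniformity step drives the prototypes toward a near-orthogonal configuration, which is what combats the prototype collapse the abstract mentions; the EMA step then lets class semantics drift in gradually without destroying that spread.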
