Paper Title
Federated Learning for Non-IID Data via Client Variance Reduction and Adaptive Server Update
Paper Authors
Paper Abstract
Federated learning (FL) is an emerging technique used to collaboratively train a global machine learning model while keeping the data localized on the user devices. The main obstacle to FL's practical implementation is the non-independent and identically distributed (Non-IID) data across users, which slows convergence and degrades performance. To tackle this fundamental issue, we propose a method (ComFed) that enhances the whole training process on both the client and server sides. The key idea of ComFed is to simultaneously utilize client-variance-reduction techniques to facilitate server aggregation and global adaptive-update techniques to accelerate learning. Our experiments on the CIFAR-10 classification task show that ComFed can improve on state-of-the-art algorithms dedicated to Non-IID data.
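The abstract names ComFed's two ingredients, client variance reduction and an adaptive server update, but does not specify the algorithm itself. The toy sketch below illustrates how these two ideas can be combined in general, using SCAFFOLD-style control variates on the clients and an Adam-like update on the server; it is not ComFed's actual procedure, and all names and hyperparameters (local_steps, beta1, server_lr, the quadratic client objectives) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_clients, local_steps, lr = 5, 4, 10, 0.05

# Each client minimizes 0.5*||w - t_i||^2 with a different optimum t_i,
# mimicking Non-IID data; the global optimum is mean(t_i).
targets = rng.normal(size=(n_clients, dim))
w_global = np.zeros(dim)

# Client-side variance reduction: SCAFFOLD-style control variates.
c_global = np.zeros(dim)                  # server control variate
c_clients = np.zeros((n_clients, dim))    # per-client control variates

# Server-side adaptive update: Adam-like moment estimates.
m, v = np.zeros(dim), np.zeros(dim)
beta1, beta2, eps, server_lr = 0.9, 0.99, 1e-8, 0.1

w_star = targets.mean(axis=0)
err_init = np.linalg.norm(w_global - w_star)

for _ in range(100):                      # communication rounds
    delta_w_sum = np.zeros(dim)
    delta_c_sum = np.zeros(dim)
    for i in range(n_clients):
        w = w_global.copy()
        for _ in range(local_steps):
            grad = w - targets[i]                       # local gradient
            w -= lr * (grad - c_clients[i] + c_global)  # variance-reduced step
        # Control-variate refresh (SCAFFOLD "option II").
        c_new = c_clients[i] - c_global + (w_global - w) / (local_steps * lr)
        delta_w_sum += w - w_global
        delta_c_sum += c_new - c_clients[i]
        c_clients[i] = c_new
    # The server treats the averaged client drift as a pseudo-gradient
    # and applies an Adam-like step instead of plain averaging.
    pseudo_grad = -delta_w_sum / n_clients
    m = beta1 * m + (1 - beta1) * pseudo_grad
    v = beta2 * v + (1 - beta2) * pseudo_grad**2
    w_global -= server_lr * m / (np.sqrt(v) + eps)
    c_global += delta_c_sum / n_clients

err_final = np.linalg.norm(w_global - w_star)
print(f"error before: {err_init:.3f}, after: {err_final:.3f}")
```

On this toy problem the global model moves toward the mean of the client optima despite the heterogeneous local objectives; the control variates keep local steps from drifting toward each client's own optimum, while the adaptive server step rescales the aggregated update per coordinate.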