Paper Title
FedDUAP: Federated Learning with Dynamic Update and Adaptive Pruning Using Shared Data on the Server
Paper Authors
Paper Abstract
Despite achieving remarkable performance, Federated Learning (FL) suffers from two critical challenges: limited computational resources and low training efficiency. In this paper, we propose FedDUAP, a novel FL framework with two original contributions that exploits the insensitive data on the server together with the decentralized data on edge devices to further improve training efficiency. First, a dynamic server update algorithm is designed to exploit the insensitive data on the server and dynamically determine the optimal number of server update steps, so as to improve the convergence and accuracy of the global model. Second, a layer-adaptive model pruning method is developed to perform distinct pruning operations adapted to the differing dimensions and importance of each layer, achieving a good balance between efficiency and effectiveness. By integrating these two techniques, FedDUAP significantly outperforms baseline approaches in terms of accuracy (up to 4.8% higher), efficiency (up to 2.8 times faster), and computational cost (up to 61.9% lower).
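The abstract only summarizes the two techniques, so the NumPy sketch below is a rough illustration of where they would sit in a FedAvg-style round, not the paper's actual algorithms. Everything beyond plain weighted averaging is an assumption: `grad_fn`, the loss-improvement stopping rule in `dynamic_server_update`, and the per-layer `importance` scores in `layer_adaptive_prune` are hypothetical stand-ins for the rules FedDUAP derives.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Aggregate client models (each a list of per-layer arrays) by
    data-size-weighted averaging, as in standard FedAvg."""
    total = float(sum(client_sizes))
    return [
        sum(w[k] * (n / total) for w, n in zip(client_weights, client_sizes))
        for k in range(len(client_weights[0]))
    ]

def dynamic_server_update(weights, grad_fn, lr=0.01, max_steps=5, tol=1e-4):
    """After aggregation, take extra gradient steps on the shared server data.
    The loop stops early once the loss improvement falls below `tol` -- a
    hypothetical stopping rule standing in for the paper's dynamic step choice."""
    prev_loss = float("inf")
    for _ in range(max_steps):
        loss, grads = grad_fn(weights)  # loss and per-layer grads on server data
        if prev_loss - loss < tol:      # diminishing returns: stop updating
            break
        weights = [w - lr * g for w, g in zip(weights, grads)]
        prev_loss = loss
    return weights

def layer_adaptive_prune(weights, importance, base_ratio=0.5):
    """Magnitude-prune each layer with its own ratio: layers with importance
    near 1 are pruned less. Zeroes every weight whose magnitude is at or
    below the k-th smallest in that layer (ties may remove a few extra)."""
    pruned = []
    for w, imp in zip(weights, importance):
        k = int(w.size * base_ratio * (1.0 - imp))  # weights to zero out
        if k > 0:
            thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
            w = np.where(np.abs(w) <= thresh, 0.0, w)
        pruned.append(w)
    return pruned

# Toy usage: two clients, a two-layer "model", and a quadratic loss on the
# server data (all placeholders).
clients = [[np.arange(16.0).reshape(4, 4), np.arange(4.0)],
           [np.ones((4, 4)), np.ones(4)]]
agg = fedavg(clients, client_sizes=[100, 300])
grad_fn = lambda ws: (sum(float((w ** 2).sum()) for w in ws), [2.0 * w for w in ws])
tuned = dynamic_server_update(agg, grad_fn)
sparse = layer_adaptive_prune(tuned, importance=[0.9, 0.2])
```

Note the design point the sketch tries to capture: the server update runs only until its marginal benefit vanishes (so the shared data cannot pull the global model too far from the federated average), and pruning ratios shrink for layers scored as important, rather than applying one global sparsity level to all layers.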