Paper Title
A Fast Blockchain-based Federated Learning Framework with Compressed Communications
Paper Authors
Paper Abstract
Recently, blockchain-based federated learning (BFL) has attracted intensive research attention because its training process is auditable and its architecture is serverless, avoiding the single point of failure of the parameter server in vanilla federated learning (VFL). Nevertheless, BFL tremendously escalates the communication traffic volume because all local model updates (i.e., changes of model parameters) obtained by BFL clients are transmitted to all miners for verification and to all clients for aggregation. In contrast, the parameter server and clients in VFL only retain aggregated model updates. Consequently, the huge communication traffic in BFL inevitably impairs training efficiency and hinders the deployment of BFL in practice. To improve the practicality of BFL, we are among the first to propose a fast, communication-efficient blockchain-based federated learning framework, called BCFL, which compresses communications in BFL. Meanwhile, we derive the convergence rate of BCFL with non-convex loss. To maximize the final model accuracy, we further formulate a problem that minimizes the training loss implied by the convergence rate, subject to a limited training time, with respect to the compression rate and the block generation rate; this is a bi-convex optimization problem that can be solved efficiently. Finally, to demonstrate the efficiency of BCFL, we carry out extensive experiments on the standard CIFAR-10 and FEMNIST datasets. Our experimental results not only verify the correctness of our analysis but also show that, compared with BFL, BCFL can remarkably reduce the communication traffic by 95-98% or shorten the training time by 90-95%.
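As a rough, hypothetical illustration of what compressing the communications of BFL clients could look like, the sketch below (Python with NumPy) applies top-k sparsification to a local model update before it is broadcast. The abstract does not specify the compression operator used in BCFL, so the choice of top-k, the function names, and the 5% rate in the example are assumptions for illustration only.

# Hypothetical sketch: top-k sparsification of a client's local model update.
# The abstract does not name BCFL's compression operator; top-k is assumed here
# purely to illustrate how a tunable compression rate shrinks the payload that
# each client broadcasts to miners and other clients.
import numpy as np

def topk_compress(update: np.ndarray, compression_rate: float):
    """Keep only the largest-magnitude fraction `compression_rate` of entries."""
    flat = update.ravel()
    k = max(1, int(compression_rate * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]   # indices of the k largest entries
    return idx, flat[idx]                          # sparse payload actually transmitted

def topk_decompress(idx: np.ndarray, values: np.ndarray, shape: tuple) -> np.ndarray:
    """Rebuild a dense update from the sparse payload (zeros everywhere else)."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

# Example: keeping 5% of the entries cuts the transmitted traffic by roughly 95%,
# the same order of reduction reported in the abstract.
update = np.random.randn(256, 128)
idx, vals = topk_compress(update, compression_rate=0.05)
restored = topk_decompress(idx, vals, update.shape)

In the full framework described by the abstract, the compression rate would not be fixed by hand as above but chosen jointly with the block generation rate by solving the stated bi-convex optimization problem.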