Paper Title
HierarchyFL: Heterogeneous Federated Learning via Hierarchical Self-Distillation
Paper Authors
Paper Abstract
Federated learning (FL) has been recognized as a privacy-preserving distributed machine learning paradigm that enables knowledge sharing among heterogeneous Artificial Intelligence of Things (AIoT) devices through centralized global model aggregation. However, FL suffers from model inaccuracy and slow convergence due to the heterogeneity of the models on the AIoT devices involved. Although various existing methods attempt to address the model heterogeneity problem, most of them improve the accuracy of heterogeneous models in a coarse-grained manner, so deploying large-scale AIoT devices remains a great challenge. To alleviate the negative impact of this problem and take full advantage of the diversity of each heterogeneous model, we propose an efficient framework named HierarchyFL, which uses a small amount of public data to achieve efficient and scalable knowledge sharing across a variety of differently structured models. Through self-distillation and our proposed ensemble library, the hierarchical models can intelligently learn from each other on the cloud server. Experimental results on various well-known datasets show that HierarchyFL not only maximizes knowledge sharing among heterogeneous models in large-scale AIoT systems, but also greatly improves the model performance of each involved heterogeneous AIoT device.
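The knowledge transfer the abstract describes rests on a distillation loss that lets one model learn from another's soft predictions on shared public data. Below is a minimal sketch in plain Python of the standard temperature-scaled KL-divergence formulation commonly used for (self-)distillation; the function names and example logits are illustrative assumptions, not taken from the paper.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature yields softer distributions."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened predictions,
    scaled by T^2 as in standard knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # soft targets from the teacher
    q = softmax(student_logits, temperature)  # student's soft predictions
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0
    )

# Hypothetical example: a smaller heterogeneous model (student) learns from
# a larger hierarchical model (teacher) on a public-data sample.
teacher_logits = [4.0, 1.0, 0.5]
student_logits = [2.0, 1.5, 1.0]
loss = distillation_loss(student_logits, teacher_logits)
```

In practice such a loss would be minimized per batch of public data; when the two sets of logits agree exactly, the loss is zero, and it grows as the student's softened distribution diverges from the teacher's.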