Paper Title

Distributed Machine Learning in D2D-Enabled Heterogeneous Networks: Architectures, Performance, and Open Challenges

Authors

Cheng, Zhipeng, Fan, Xuwei, Liwang, Minghui, Chen, Ning, Xia, Xiaoyu, Wang, Xianbin

Abstract

The ever-growing concerns regarding data privacy have led to a paradigm shift in machine learning (ML) architectures from centralized to distributed approaches, giving rise to federated learning (FL) and split learning (SL) as the two predominant privacy-preserving ML mechanisms. However, implementing FL or SL in device-to-device (D2D)-enabled heterogeneous networks with diverse clients presents substantial challenges, including architecture scalability and prolonged training delays. To address these challenges, this article introduces two innovative hybrid distributed ML architectures, namely, hybrid split FL (HSFL) and hybrid federated SL (HFSL). Such architectures combine the strengths of both FL and SL in D2D-enabled heterogeneous wireless networks. We provide a comprehensive analysis of the performance and advantages of HSFL and HFSL, while also highlighting open challenges for future exploration. We support our proposals with preliminary simulations using three datasets in non-independent and non-identically distributed settings, demonstrating the feasibility of our architectures. Our simulations reveal notable reductions in communication/computation costs and training delays as compared to conventional FL and SL.
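For readers unfamiliar with the FL side of the hybrid architectures above, the core aggregation step in FL is a data-size-weighted average of client model updates (FedAvg). The sketch below is purely illustrative and not the paper's code; the function name `fedavg` and the flat-vector representation of model weights are assumptions for clarity.

```python
# Minimal sketch of federated averaging (FedAvg), the aggregation step
# underlying FL-based architectures such as the HSFL/HFSL described above.
# Names and data shapes here are illustrative assumptions, not the paper's code.

def fedavg(client_weights, client_sizes):
    """Aggregate per-client model weights by a data-size-weighted average.

    client_weights: list of flat parameter vectors (one list of floats per client)
    client_sizes:   list of local-dataset sizes (one int per client)
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    global_weights = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            global_weights[i] += (n / total) * w[i]
    return global_weights

# Two clients; the client with 3 samples pulls the average toward its weights.
print(fedavg([[1.0, 2.0], [3.0, 4.0]], [1, 3]))  # [2.5, 3.5]
```

In SL, by contrast, the model is partitioned at a cut layer and only activations/gradients cross the network; the hybrid architectures in the paper mix these two modes across heterogeneous D2D clients.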
