Paper Title

A Machine Learning Framework for Distributed Functional Compression over Wireless Channels in IoT

Paper Authors

Yashas Malur Saidutta, Afshin Abdi, Faramarz Fekri

Paper Abstract

IoT devices generating enormous amounts of data, together with state-of-the-art machine learning techniques, will revolutionize cyber-physical systems. In many diverse fields, from autonomous driving to augmented reality, distributed IoT devices compute specific target functions that lack simple closed forms, such as obstacle detection and object recognition. Traditional cloud-based methods, which focus on transferring data to a central location for either training or inference, place enormous strain on network resources. To address this, we develop, to the best of our knowledge, the first machine learning framework for distributed functional compression over both the Gaussian Multiple Access Channel (GMAC) and orthogonal AWGN channels. Owing to the Kolmogorov-Arnold representation theorem, our machine learning framework can, by design, compute any arbitrary function for the desired functional compression task in IoT. Importantly, the raw sensory data are never transferred to a central node for training or inference, thereby reducing communication. For these algorithms, we provide theoretical convergence guarantees and upper bounds on communication. Our simulations show that the learned encoders and decoders for functional compression perform significantly better than traditional approaches and are robust to channel condition changes and sensor outages. Compared to the cloud-based scenario, our algorithms reduce channel use by two orders of magnitude.
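As a rough illustration of the setup the abstract describes, the sketch below models each sensor as a small neural encoder whose transmitted signals superimpose over a simulated GMAC (sum of the sensors' signals plus Gaussian noise), with a central neural decoder trained end to end to estimate the target function from the noisy received signal; this encoder-sum-decoder shape mirrors the structure guaranteed by the Kolmogorov-Arnold representation theorem, which the abstract cites. This is a minimal PyTorch sketch under stated assumptions, not the paper's implementation: the network sizes, power normalization, noise level, and the example target function f(x) = max_k x_k are all illustrative.

```python
import torch
import torch.nn as nn

# Minimal sketch of distributed functional compression over a GMAC.
# Each sensor has its own encoder; encoded signals add up on the channel
# (superposition plus AWGN); a central decoder estimates f(x_1, ..., x_n).
# All names, sizes, and the target function below are illustrative assumptions.

NUM_SENSORS = 4    # number of distributed IoT sensors (assumption)
CHANNEL_USES = 8   # real channel uses per transmission (assumption)
NOISE_STD = 0.1    # AWGN standard deviation (assumption)

class SensorEncoder(nn.Module):
    """Maps one sensor's local observation to a channel input (no raw-data sharing)."""
    def __init__(self, obs_dim=1, channel_uses=CHANNEL_USES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 32), nn.ReLU(),
            nn.Linear(32, channel_uses),
        )

    def forward(self, x):
        z = self.net(x)
        # Scale to unit norm as a simple per-transmission power constraint.
        return z / (z.norm(dim=-1, keepdim=True) + 1e-8)

class CentralDecoder(nn.Module):
    """Estimates the target function value from the noisy superimposed signal."""
    def __init__(self, channel_uses=CHANNEL_USES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(channel_uses, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, y):
        return self.net(y)

encoders = nn.ModuleList([SensorEncoder() for _ in range(NUM_SENSORS)])
decoder = CentralDecoder()
opt = torch.optim.Adam([*encoders.parameters(), *decoder.parameters()], lr=1e-3)

for step in range(1000):
    # Each sensor observes its own scalar measurement.
    x = torch.randn(256, NUM_SENSORS, 1)
    # Illustrative target function: f(x) = max_k x_k.
    target = x.squeeze(-1).max(dim=-1, keepdim=True).values

    # GMAC: transmitted signals superimpose in the air, then AWGN is added.
    tx = torch.stack([enc(x[:, k]) for k, enc in enumerate(encoders)], dim=0)
    y = tx.sum(dim=0) + NOISE_STD * torch.randn_like(tx[0])

    loss = nn.functional.mse_loss(decoder(y), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

For the orthogonal AWGN setting the abstract also mentions, the over-the-air sum would roughly be replaced by separate noisy links whose outputs are concatenated at the decoder input; the encoders and decoder are otherwise trained the same way.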
