Paper Title

HyBNN and FedHyBNN: (Federated) Hybrid Binary Neural Networks

Authors

Dua, Kinshuk

Abstract

Binary Neural Networks (BNNs), neural networks with weights and activations constrained to -1(0) and +1, are an alternative to deep neural networks that offers faster training, lower memory consumption, and lightweight models, making them ideal for resource-constrained devices while still leveraging the architecture of their deep neural network counterparts. However, the input binarization step used in BNNs causes a severe loss of accuracy. In this paper, we introduce a novel hybrid neural network architecture, the Hybrid Binary Neural Network (HyBNN), consisting of a task-independent, general, full-precision variational autoencoder with a binary latent space and a task-specific binary neural network; by using the full-precision variational autoencoder as a feature extractor, HyBNN greatly limits the accuracy loss caused by input binarization. We use it to combine the state-of-the-art accuracy of deep neural networks with the much faster training, quicker test-time inference, and power efficiency of binary neural networks. We show that our proposed system significantly outperforms a vanilla binary neural network with input binarization. We also introduce FedHyBNN, a highly communication-efficient federated counterpart to HyBNN, and demonstrate that it reaches the same accuracy as its non-federated equivalent. We make our source code, experimental parameters and models available at: https://anonymous.4open.science/r/HyBNN.
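To make the architecture described in the abstract concrete, below is a minimal, illustrative PyTorch sketch of the HyBNN idea: a full-precision encoder (standing in for the variational autoencoder) that emits a binary latent code, feeding a task-specific binary network. All class names, layer sizes, and the straight-through binarization scheme are assumptions made for illustration only, not the authors' implementation; see the linked repository for the actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through estimator for the backward pass."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass gradients through only where |x| <= 1 (hard-tanh clipping).
        return grad_output * (x.abs() <= 1).float()


def binarize(x):
    return BinarizeSTE.apply(x)


class LatentEncoder(nn.Module):
    """Stands in for the task-independent, full-precision VAE encoder.

    The VAE decoder and KL term are omitted; only the part that produces the
    binary latent code consumed by the binary network is sketched here.
    """

    def __init__(self, in_dim=784, latent_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                 nn.Linear(256, latent_dim))

    def forward(self, x):
        # Binarize the latent code so the downstream BNN never sees raw,
        # full-precision inputs (avoiding a lossy input-binarization step).
        return binarize(self.net(x))


class BinaryHead(nn.Module):
    """Task-specific binary network operating on the binary latent code."""

    def __init__(self, latent_dim=64, num_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(latent_dim, 128)
        self.fc2 = nn.Linear(128, num_classes)

    def forward(self, z):
        # Weights (and the hidden activation) are binarized on the fly.
        h = binarize(F.linear(z, binarize(self.fc1.weight), self.fc1.bias))
        return F.linear(h, binarize(self.fc2.weight), self.fc2.bias)


class HyBNNSketch(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = LatentEncoder()
        self.head = BinaryHead()

    def forward(self, x):
        return self.head(self.encoder(x.flatten(1)))


if __name__ == "__main__":
    model = HyBNNSketch()
    logits = model(torch.randn(8, 1, 28, 28))  # e.g. a batch of MNIST-sized inputs
    print(logits.shape)  # torch.Size([8, 10])
```

Under this reading of the abstract, the full-precision encoder absorbs the information loss that input binarization would otherwise cause, while the head keeps the storage and inference benefits of a BNN; a federated variant along the lines of FedHyBNN would then only need to communicate updates for the lightweight binary head.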
