Paper Title

AutoPruning for Deep Neural Network with Dynamic Channel Masking

Paper Authors

Baopu Li, Yanwen Fan, Zhihong Pan, Gang Zhang

Paper Abstract

Modern deep neural network models are large and computationally intensive. One typical solution to this issue is model pruning. However, most current pruning algorithms depend on hand-crafted rules or domain expertise. To overcome this problem, we propose a learning-based auto-pruning algorithm for deep neural networks, inspired by recent automatic machine learning (AutoML). A two-objective problem that jointly targets the weights and the best channel numbers for each layer is first formulated. An alternating optimization approach is then proposed to derive the optimal channel numbers and weights simultaneously. In the pruning process, we utilize a searchable hyperparameter, the remaining ratio, to denote the number of channels in each convolution layer, and then propose a dynamic masking process to describe the corresponding channel evolution. To control the trade-off between the accuracy of a model and the pruning ratio of floating-point operations, a novel loss function is further introduced. Preliminary experimental results on benchmark datasets demonstrate that our scheme achieves competitive results for neural network pruning.
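The abstract describes the method only at a high level, so the following is a minimal PyTorch-style sketch of what per-layer dynamic channel masking with a searchable remaining ratio and a FLOPs-aware loss could look like. All names (`MaskedConv2d`, `ratio_logit`, `pruning_loss`, `target_flops_ratio`, `lam`) and the soft-ranking mask are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedConv2d(nn.Module):
    """Convolution with a learnable 'remaining ratio' that dynamically
    masks output channels (hypothetical sketch; the paper's exact
    masking rule and parameterization are not given in the abstract)."""
    def __init__(self, in_ch, out_ch, kernel_size, **kw):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, **kw)
        # Searchable hyperparameter: fraction of channels to keep,
        # stored as a logit so sigmoid keeps it in (0, 1).
        self.ratio_logit = nn.Parameter(torch.zeros(1))

    def channel_mask(self):
        ratio = torch.sigmoid(self.ratio_logit)   # remaining ratio in (0, 1)
        out_ch = self.conv.out_channels
        # Rank channels by the L1 norm of their filters; a soft step keeps
        # roughly the top `ratio * out_ch` channels and stays differentiable
        # with respect to the ratio (ranks themselves carry no gradient).
        importance = self.conv.weight.abs().sum(dim=(1, 2, 3))
        ranks = importance.argsort(descending=True).argsort().float()
        k = ratio * out_ch
        return torch.sigmoid((k - ranks) * 5.0)   # ~1 = keep, ~0 = prune

    def forward(self, x):
        mask = self.channel_mask().view(1, -1, 1, 1)
        return self.conv(x) * mask

def pruning_loss(logits, targets, model, target_flops_ratio=0.5, lam=2.0):
    """Cross-entropy plus a penalty pushing the kept-FLOPs fraction toward
    a target; an assumed form of the paper's accuracy/FLOPs trade-off loss."""
    ce = F.cross_entropy(logits, targets)
    ratios = torch.stack([torch.sigmoid(m.ratio_logit)
                          for m in model.modules()
                          if isinstance(m, MaskedConv2d)])
    # Crude kept-FLOPs proxy: each layer's FLOPs scale with its remaining
    # ratio (input-channel coupling between layers ignored for brevity).
    kept = ratios.mean()
    return ce + lam * (kept - target_flops_ratio).clamp(min=0) ** 2
```

In a training loop built on this sketch, one would alternate gradient steps on the convolution weights and on the `ratio_logit` parameters, mirroring the alternating optimization over weights and channel numbers described in the abstract.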
