Paper Title

SInGE: Sparsity via Integrated Gradients Estimation of Neuron Relevance

Paper Authors

Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly

Paper Abstract

The leap in performance of state-of-the-art computer vision methods is attributed to the development of deep neural networks. However, this often comes at a computational price that may hinder their deployment. To alleviate this limitation, structured pruning is a well-known technique that consists in removing channels, neurons, or filters, and is commonly applied to produce more compact models. In most cases, the computations to remove are selected based on a relative importance criterion. At the same time, the need for explainable predictive models has risen tremendously, motivating the development of robust attribution methods that highlight the relative importance of pixels of an input image or feature map. In this work, we discuss the limitations of existing pruning heuristics, among which are magnitude- and gradient-based methods. We draw inspiration from attribution methods to design a novel integrated-gradient pruning criterion, in which the relevance of each neuron is defined as the integral of the gradient variation along a path towards this neuron's removal. Furthermore, we propose an entwined DNN pruning and fine-tuning flowchart to better preserve DNN accuracy while removing parameters. We show, through extensive validation on several datasets, architectures, and pruning scenarios, that the proposed method, dubbed SInGE, significantly outperforms existing state-of-the-art DNN pruning methods.
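
To make the criterion concrete, below is a minimal PyTorch sketch of an integrated-gradient relevance score for one output channel, based only on the abstract's description: a gate alpha on the channel's activation is swept from 1 (channel kept) towards 0 (channel removed), and the magnitudes of the loss gradient with respect to alpha are accumulated along this path. The function name `singe_relevance`, the hook-based gating, the choice of `n_steps`, and the rectangle-rule approximation are all illustrative assumptions, not the authors' released implementation.

```python
import torch

def singe_relevance(model, loss_fn, batch, layer, channel, n_steps=16):
    """Riemann-sum estimate of an integrated-gradient relevance score for one
    output channel of `layer` (a sketch in the spirit of the SInGE criterion):
    sweep a gate alpha on the channel's activation from 1 (kept) towards 0
    (removed) and accumulate |dL/d alpha| along the path."""
    inputs, targets = batch
    relevance = 0.0
    for k in range(n_steps):
        # Path point: alpha = 1 leaves the channel untouched, alpha -> 0 removes it.
        alpha = torch.tensor(1.0 - k / n_steps, requires_grad=True)

        def gate_hook(module, inp, out, alpha=alpha):
            # Scale only the selected channel by alpha, avoiding in-place ops
            # so that the gradient can flow back to alpha.
            onehot = torch.zeros(out.shape[1], device=out.device)
            onehot[channel] = 1.0
            gate = 1.0 - (1.0 - alpha) * onehot  # 1 everywhere, alpha at `channel`
            shape = [1, out.shape[1]] + [1] * (out.dim() - 2)
            return out * gate.view(shape)

        handle = layer.register_forward_hook(gate_hook)
        loss = loss_fn(model(inputs), targets)
        (grad_alpha,) = torch.autograd.grad(loss, alpha)
        handle.remove()
        relevance += grad_alpha.abs().item() / n_steps  # rectangle rule

    return relevance
```

Under these assumptions, one would loop this score over every channel of a layer (averaging over a few calibration batches), prune the lowest-scoring channels, and fine-tune before scoring and pruning the next group, mirroring the entwined pruning and fine-tuning schedule the abstract describes.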
