Paper Title
Exploring Content Relationships for Distilling Efficient GANs
Paper Authors
Paper Abstract
This paper proposes content relationship distillation (CRD) to tackle over-parameterized generative adversarial networks (GANs) and make them serviceable on cutting-edge devices. In contrast to traditional instance-level distillation, we design a novel GAN-compression-oriented form of knowledge by slicing the contents of teacher outputs into multiple fine-grained granularities, such as row/column strips (global information) and image patches (local information), modeling the relationships among them, such as pairwise distances and triplet-wise angles, and encouraging the student to capture these relationships within its own output contents. Built upon this content-level distillation, we also deploy an online teacher discriminator, which keeps updating when co-trained with the teacher generator and stays frozen when co-trained with the student generator, yielding better adversarial training. Extensive experiments on three benchmark datasets show that our CRD achieves the greatest complexity reduction of GANs while obtaining the best performance in comparison with existing methods. For example, we reduce the MACs of CycleGAN by around 40x and its parameters by over 80x, while obtaining an FID of 46.61, compared with 51.92 for the current state of the art. The code of this project is available at https://github.com/TheKernelZ/CRD.
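
The abstract's core mechanism, slicing outputs into strips and patches and matching pairwise distances and triplet-wise angles between teacher and student, can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the function names, the patch size, and the subsampling step are assumptions, and the relational losses follow the generic relational-KD formulation the abstract alludes to.

```python
# A minimal sketch of the content-relationship idea, under assumed names
# (slice_contents, crd_loss) and an assumed relational-KD style loss.
import torch
import torch.nn.functional as F

def slice_contents(x, patch=32):
    """Slice images (B, C, H, W) into row strips, column strips (global
    information) and square patches (local information), each flattened
    into a set of content vectors of shape (B, N, D)."""
    b, c, h, w = x.shape
    rows = x.permute(0, 2, 1, 3).reshape(b, h, c * w)       # one vector per row strip
    cols = x.permute(0, 3, 1, 2).reshape(b, w, c * h)       # one vector per column strip
    patches = F.unfold(x, kernel_size=patch, stride=patch)  # (B, C*patch*patch, L)
    return [rows, cols, patches.transpose(1, 2)]

def pairwise_distance_loss(t, s):
    """Match mean-normalized pairwise distances among content vectors."""
    dt, ds = torch.cdist(t, t), torch.cdist(s, s)
    return F.smooth_l1_loss(ds / (ds.mean() + 1e-8), dt / (dt.mean() + 1e-8))

def triplet_angle_loss(t, s):
    """Match the cosines of the angles formed by triplets of vectors."""
    def angles(e):
        d = F.normalize(e.unsqueeze(1) - e.unsqueeze(2), dim=-1)  # (B, N, N, D)
        return torch.einsum('bijd,bikd->bijk', d, d)              # angle at vertex i
    return F.smooth_l1_loss(angles(s), angles(t))

def crd_loss(teacher_out, student_out, patch=32, n=16):
    """Content relationship loss between teacher and student outputs."""
    loss = 0.0
    for t, s in zip(slice_contents(teacher_out.detach(), patch),
                    slice_contents(student_out, patch)):
        # Subsample the same slice indices from teacher and student so the
        # relationships are computed over corresponding content pieces.
        idx = torch.randperm(t.shape[1], device=t.device)[:n]
        t, s = t[:, idx], s[:, idx]
        loss = loss + pairwise_distance_loss(t, s) + triplet_angle_loss(t, s)
    return loss
```

Matching relations rather than raw pixels makes the target scale-free, which is why the pairwise distances are normalized by their mean before comparison.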
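
The online teacher-discriminator schedule can likewise be sketched as a two-phase training step. The least-squares GAN losses, the `train_step` signature, and the optimizer wiring here are placeholders assumed for illustration, not the paper's exact setup.

```python
# A hypothetical two-phase step: the teacher discriminator keeps updating
# against the teacher generator, then is frozen for the student's turn.
import torch

def lsgan_d_loss(real_logits, fake_logits):
    return ((real_logits - 1) ** 2).mean() + (fake_logits ** 2).mean()

def lsgan_g_loss(fake_logits):
    return ((fake_logits - 1) ** 2).mean()

def train_step(teacher_G, student_G, teacher_D, src, tgt, opt_D, opt_S):
    # Phase 1: the discriminator keeps updating while co-trained with the
    # teacher generator, so it remains an up-to-date online critic.
    for p in teacher_D.parameters():
        p.requires_grad_(True)
    fake_t = teacher_G(src).detach()
    loss_d = lsgan_d_loss(teacher_D(tgt), teacher_D(fake_t))
    opt_D.zero_grad(); loss_d.backward(); opt_D.step()

    # Phase 2: the discriminator stays frozen while it supplies the
    # adversarial signal (plus the CRD loss sketched above) to the student.
    for p in teacher_D.parameters():
        p.requires_grad_(False)
    fake_s = student_G(src)
    loss_s = lsgan_g_loss(teacher_D(fake_s)) + crd_loss(teacher_G(src).detach(), fake_s)
    opt_S.zero_grad(); loss_s.backward(); opt_S.step()
```

Freezing the critic during the student phase keeps the small student from degrading it, while the teacher phase keeps the critic current, which is the "online" behavior the abstract describes.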