Paper Title

ProxylessKD: Direct Knowledge Distillation with Inherited Classifier for Face Recognition

Paper Authors

Weidong Shi, Guanghui Ren, Yunpeng Chen, Shuicheng Yan

Paper Abstract

Knowledge Distillation (KD) refers to transferring knowledge from a large model to a smaller one, which is widely used to enhance model performance in machine learning. It tries to align embedding spaces generated from the teacher and the student model (i.e. to make images corresponding to the same semantics share the same embedding across different models). In this work, we focus on its application in face recognition. We observe that existing knowledge distillation models optimize the proxy tasks that force the student to mimic the teacher's behavior, instead of directly optimizing the face recognition accuracy. Consequently, the obtained student models are not guaranteed to be optimal on the target task or able to benefit from advanced constraints, such as large margin constraints (e.g. margin-based softmax). We then propose a novel method named ProxylessKD that directly optimizes face recognition accuracy by inheriting the teacher's classifier as the student's classifier to guide the student to learn discriminative embeddings in the teacher's embedding space. The proposed ProxylessKD is very easy to implement and sufficiently generic to be extended to other tasks beyond face recognition. We conduct extensive experiments on standard face recognition benchmarks, and the results demonstrate that ProxylessKD achieves superior performance over existing knowledge distillation methods.
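
The core mechanism described in the abstract can be illustrated with a short sketch: the student network is trained against the teacher's frozen classifier weights using a margin-based softmax, so it directly optimizes recognition accuracy inside the teacher's embedding space. Below is a minimal PyTorch sketch assuming an ArcFace-style additive angular margin; the toy backbone, dimensions, and names (e.g. InheritedArcFaceHead, teacher_weight) are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InheritedArcFaceHead(nn.Module):
    """Margin-based softmax head whose class weights are inherited from the
    trained teacher and frozen, so the student learns discriminative
    embeddings directly in the teacher's embedding space (sketch of the
    idea described in the abstract, not the official implementation)."""

    def __init__(self, teacher_weight: torch.Tensor,
                 margin: float = 0.5, scale: float = 64.0):
        super().__init__()
        # Inherit the teacher's classifier weights [num_classes, emb_dim]
        # and freeze them: only the student backbone will be updated.
        self.weight = nn.Parameter(teacher_weight.clone(), requires_grad=False)
        self.margin = margin
        self.scale = scale

    def forward(self, embeddings: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between L2-normalized embeddings and class weights.
        cos = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        # Simplified ArcFace: add an angular margin on the target-class logit.
        theta = torch.acos(cos.clamp(-1.0 + 1e-7, 1.0 - 1e-7))
        target = F.one_hot(labels, num_classes=self.weight.size(0)).bool()
        logits = self.scale * torch.where(target, torch.cos(theta + self.margin), cos)
        return F.cross_entropy(logits, labels)

if __name__ == "__main__":
    emb_dim, num_classes = 128, 1000
    # Stand-in for the teacher's trained classifier weights (random here).
    teacher_weight = torch.randn(num_classes, emb_dim)
    head = InheritedArcFaceHead(teacher_weight)

    # Toy student backbone; any small network producing emb_dim features works.
    student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 112 * 112, emb_dim))
    optimizer = torch.optim.SGD(student.parameters(), lr=0.1)

    images = torch.randn(8, 3, 112, 112)
    labels = torch.randint(0, num_classes, (8,))
    loss = head(student(images), labels)  # directly optimizes the recognition loss
    loss.backward()
    optimizer.step()
    print(f"training loss: {loss.item():.4f}")
```

Note that only the student backbone receives gradients: freezing the inherited classifier is what anchors the student's embeddings to the teacher's embedding space, which is the design choice the abstract emphasizes over mimicking the teacher through a proxy task.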
