Paper Title
Unlabeled Data Deployment for Classification of Diabetic Retinopathy Images Using Knowledge Transfer
Paper Authors
Paper Abstract
Convolutional neural networks (CNNs) are highly beneficial for medical image processing. Medical images are plentiful, but annotated data are scarce. Transfer learning is used to mitigate the shortage of labeled data and gives CNNs better training capability. Transfer learning can be applied in many different medical applications; however, the target model must have the same size as the original network. Knowledge distillation has recently been proposed to transfer the knowledge of one model to another and can help cover this shortcoming of transfer learning. However, some parts of the knowledge may not be distilled by knowledge distillation. In this paper, a novel knowledge distillation using transfer learning is proposed to transfer the whole knowledge of one model to another. The proposed method can be beneficial and practical for medical image analysis, in which only a small amount of labeled data is available. The proposed method is tested on diabetic retinopathy classification. Simulation results demonstrate that, using the proposed method, the knowledge of a large network can be transferred to a smaller model.
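For orientation, the sketch below shows standard soft-target knowledge distillation (teacher logits softened with a temperature, combined with hard-label cross-entropy), which is the general technique the abstract builds on; it is not the paper's exact method. The teacher/student architectures (ResNet-50/ResNet-18), the five-grade output layer, the temperature, and the loss weighting are illustrative assumptions.

```python
# Minimal knowledge-distillation sketch for a diabetic retinopathy classifier.
# Architectures and hyperparameters are assumptions, not the paper's settings.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of softened-logit KL loss and hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Teacher: a large CNN assumed to be already fine-tuned on labeled fundus
# images, adapted here to 5 retinopathy severity grades (assumption).
teacher = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
teacher.fc = nn.Linear(teacher.fc.in_features, 5)
teacher.eval()

# Student: a smaller CNN trained to mimic the teacher's outputs.
student = models.resnet18(weights=None)
student.fc = nn.Linear(student.fc.in_features, 5)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)

def train_step(images, labels):
    """One distillation step on a batch of images with (possibly few) labels."""
    with torch.no_grad():
        t_logits = teacher(images)
    s_logits = student(images)
    loss = distillation_loss(s_logits, t_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the soft-target term depends only on the teacher's outputs, unlabeled fundus images can also contribute to training (for example, by dropping the hard-label term or using the teacher's argmax as a pseudo-label), which matches the abstract's emphasis on deploying unlabeled data.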