Paper Title

Boosting Semi-Supervised Learning with Contrastive Complementary Labeling

Authors

Qinyi Deng, Yong Guo, Zhibang Yang, Haolin Pan, Jian Chen

Abstract

Semi-supervised learning (SSL) has achieved great success in leveraging a large amount of unlabeled data to learn a promising classifier. A popular approach is pseudo-labeling, which generates pseudo labels only for those unlabeled data with high-confidence predictions. As for the low-confidence ones, existing methods often simply discard them because these unreliable pseudo labels may mislead the model. Nevertheless, we highlight that data with low-confidence pseudo labels can still be beneficial to the training process. Specifically, although the class with the highest probability in the prediction is unreliable, we can assume that the sample is very unlikely to belong to the classes with the lowest probabilities. In this way, these data can also be very informative if we effectively exploit their complementary labels, i.e., the classes that a sample does not belong to. Inspired by this, we propose a novel Contrastive Complementary Labeling (CCL) method that constructs a large number of reliable negative pairs based on the complementary labels and adopts contrastive learning to make use of all the unlabeled data. Extensive experiments demonstrate that CCL significantly improves performance on top of existing methods. More critically, CCL is particularly effective under label-scarce settings. For example, it yields an improvement of 2.43% over FixMatch on CIFAR-10 with only 40 labeled samples.
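
To make the idea concrete, below is a minimal sketch of how complementary labels could be derived from low-confidence predictions and used to build reliable negative pairs for a contrastive objective. This is not the authors' implementation: the function names, the number of complementary labels k, the temperature tau, and the simplified push-apart loss are all illustrative assumptions.

```python
# A hypothetical sketch of complementary labeling for contrastive learning.
# Assumptions (not from the paper): the k lowest-probability classes serve as
# complementary labels, and the loss simply penalizes cosine similarity
# between reliable negative pairs.
import torch
import torch.nn.functional as F

def complementary_labels(logits: torch.Tensor, k: int = 3) -> torch.Tensor:
    """Indices of the k lowest-probability classes per sample, shape (B, k)."""
    probs = logits.softmax(dim=-1)
    return probs.topk(k, dim=-1, largest=False).indices

def negative_pair_mask(comp: torch.Tensor, pseudo: torch.Tensor) -> torch.Tensor:
    """mask[i, j] is True when sample j's pseudo label is one of sample i's
    complementary labels, i.e., (i, j) forms a reliable negative pair."""
    # comp: (B, k), pseudo: (B,) -> broadcast to (B, B, k), reduce over k.
    return (comp.unsqueeze(1) == pseudo.view(1, -1, 1)).any(dim=-1)

def contrastive_negative_loss(feats: torch.Tensor, neg_mask: torch.Tensor,
                              tau: float = 0.1) -> torch.Tensor:
    """Push apart the embeddings of reliable negative pairs (simplified)."""
    feats = F.normalize(feats, dim=-1)
    sim = feats @ feats.t() / tau            # (B, B) scaled cosine similarity
    return (sim * neg_mask).sum() / neg_mask.sum().clamp(min=1)

# Toy usage: 4 unlabeled samples, 10 classes, 128-d embeddings.
logits = torch.randn(4, 10)
feats = torch.randn(4, 128)
pseudo = logits.argmax(dim=-1)               # top-1 pseudo labels
mask = negative_pair_mask(complementary_labels(logits), pseudo)
loss = contrastive_negative_loss(feats, mask)
```

In this sketch, even a sample whose top-1 prediction would be discarded under a confidence threshold still contributes training signal: its lowest-probability classes mark other samples' pseudo labels as negatives, which is the intuition behind CCL's reliable negative pairs.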
