Paper Title
Contrastive Learning for Online Semi-Supervised General Continual Learning
Paper Authors
Paper Abstract
We study Online Continual Learning with missing labels and propose SemiCon, a new contrastive loss designed for partly labeled data. We demonstrate its efficiency by devising a memory-based method trained on an unlabeled data stream, where each sample added to memory is labeled by an oracle. Our approach outperforms existing semi-supervised methods when few labels are available, and obtains results similar to state-of-the-art supervised methods while using only 2.6% of the labels on Split-CIFAR10 and 10% of the labels on Split-CIFAR100.
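The abstract does not give the form of the SemiCon loss, but a common way to build a contrastive objective over partly labeled data is to mix a supervised term (SupCon-style: same-class samples are positives) for labeled anchors with an instance-level term (SimCLR-style: only the other augmented view is a positive) for unlabeled anchors. The sketch below illustrates that general idea and is an assumption, not the authors' actual loss; the function name and array layout are hypothetical.

```python
import numpy as np

def semi_supervised_contrastive_loss(z, labels, temperature=0.1):
    """Hypothetical sketch of a contrastive loss over partly labeled data.

    z: (2N, d) L2-normalized embeddings, two augmented views per sample;
       the two views of sample i sit at indices i and i + N.
    labels: (2N,) int array; -1 marks an unlabeled sample.
    Labeled anchors use all same-class samples as positives (SupCon-style);
    unlabeled anchors use only their other augmented view (SimCLR-style).
    """
    n = z.shape[0] // 2
    sim = z @ z.T / temperature
    np.fill_diagonal(sim, -np.inf)  # exclude self-similarity from the softmax
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))

    loss = 0.0
    for i in range(2 * n):
        other_view = (i + n) % (2 * n)  # index of the anchor's second view
        if labels[i] >= 0:
            # labeled anchor: positives are all other samples with the same label
            pos = np.where(labels == labels[i])[0]
            pos = pos[pos != i]
        else:
            # unlabeled anchor: the only positive is the other augmented view
            pos = np.array([other_view])
        loss += -log_prob[i, pos].mean()
    return loss / (2 * n)
```

In a memory-based setup like the one described, the labeled anchors would come from the oracle-labeled memory buffer and the unlabeled anchors from the incoming stream batch.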