Paper Title

XCon: Learning with Experts for Fine-grained Category Discovery

Paper Authors

Yixin Fei, Zhongkai Zhao, Siwei Yang, Bingchen Zhao

Paper Abstract

We address the problem of generalized category discovery (GCD) in this paper, i.e. clustering unlabeled images by leveraging the information from a set of seen classes, where the unlabeled images could contain both seen and unseen classes. The seen classes can be viewed as an implicit criterion for what counts as a class, which makes this setting different from unsupervised clustering, where the clustering criteria may be ambiguous. We mainly focus on the problem of discovering categories within a fine-grained dataset, since it is one of the most direct applications of category discovery, i.e. helping experts discover novel concepts within an unlabeled dataset using the implicit criterion set forth by the seen classes. State-of-the-art methods for generalized category discovery leverage contrastive learning to learn the representations, but the large inter-class similarity and intra-class variance pose a challenge for these methods because the negative examples may contain irrelevant cues for recognizing a category, so the algorithms may converge to local minima. We present a novel method called Expert-Contrastive Learning (XCon) to help the model mine useful information from the images by first partitioning the dataset into sub-datasets using k-means clustering and then performing contrastive learning on each of the sub-datasets to learn fine-grained discriminative features. Experiments on fine-grained datasets show clearly improved performance over the previous best methods, indicating the effectiveness of our method.
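
The core recipe described in the abstract, partitioning the unlabeled data into sub-datasets with k-means and then contrasting samples only within each sub-dataset, can be sketched as below. This is a minimal illustration under assumed details (a frozen self-supervised backbone providing the features, an InfoNCE-style loss, scikit-learn's KMeans, and illustrative names such as partition_dataset and info_nce_within_partition), not the authors' released implementation; the key point it shows is that negatives are restricted to samples in the same k-means partition, so the loss must rely on fine-grained cues.

```python
# Minimal sketch of the idea in the abstract: split the dataset into
# "expert" sub-datasets via k-means on backbone features, then apply an
# InfoNCE-style contrastive loss whose negatives come only from the same
# sub-dataset. Names and hyperparameters here are illustrative assumptions.

import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans


def partition_dataset(features: torch.Tensor, n_experts: int = 8) -> torch.Tensor:
    """Assign each sample to a sub-dataset by k-means on its features."""
    kmeans = KMeans(n_clusters=n_experts, n_init=10, random_state=0)
    assignments = kmeans.fit_predict(features.cpu().numpy())
    return torch.as_tensor(assignments)


def info_nce_within_partition(z1, z2, assignments, temperature=0.07):
    """Contrastive loss where negatives are restricted to the same partition.

    z1, z2: L2-normalized embeddings of two augmented views, shape (N, D).
    assignments: partition index of each sample, shape (N,).
    """
    assignments = assignments.to(z1.device)
    logits = z1 @ z2.t() / temperature                         # (N, N) similarities
    same_part = assignments.unsqueeze(0) == assignments.unsqueeze(1)
    # Pairs from different partitions never act as negatives.
    logits = logits.masked_fill(~same_part, float("-inf"))
    targets = torch.arange(z1.size(0), device=z1.device)       # positives on the diagonal
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    torch.manual_seed(0)
    feats = torch.randn(256, 128)                   # stand-in for frozen backbone features
    parts = partition_dataset(feats, n_experts=8)   # coarse k-means grouping
    z1 = F.normalize(torch.randn(256, 64), dim=1)   # embeddings of view 1
    z2 = F.normalize(torch.randn(256, 64), dim=1)   # embeddings of view 2
    print(info_nce_within_partition(z1, z2, parts).item())
```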
