Paper Title

Learning Deep Representations via Contrastive Learning for Instance Retrieval

Paper Authors

Tao Wu, Tie Luo, Donald Wunsch

Abstract

Instance-level Image Retrieval (IIR), or simply Instance Retrieval, deals with the problem of finding all the images within a dataset that contain a query instance (e.g., an object). This paper makes the first attempt to tackle this problem using instance-discrimination-based contrastive learning (CL). While CL has shown impressive performance on many computer vision tasks, similar success has never been found in the field of IIR. In this work, we approach this problem by exploring the capability of deriving discriminative representations from pre-trained and fine-tuned CL models. To begin with, we investigate the efficacy of transfer learning in IIR by comparing off-the-shelf features learned by a pre-trained deep neural network (DNN) classifier with features learned by a CL model. These findings inspired us to propose a new training strategy that optimizes CL towards learning IIR-oriented features, using an Average Precision (AP) loss together with a fine-tuning method to learn contrastive feature representations tailored to IIR. Our empirical evaluation demonstrates significant performance enhancement over the off-the-shelf features learned from a pre-trained DNN classifier on the challenging Oxford and Paris datasets.
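The instance-discrimination contrastive learning the abstract refers to is typically trained with an InfoNCE-style objective, where each query embedding must identify its own positive key among a batch of negatives. The paper's actual objective combines this with an AP loss, whose exact form is not given here; the sketch below shows only the standard InfoNCE component, using NumPy and an assumed temperature of 0.07.

```python
import numpy as np

def info_nce_loss(queries, keys, temperature=0.07):
    """InfoNCE loss for instance discrimination: the positive key for
    query i is keys[i]; all other keys in the batch act as negatives."""
    # L2-normalize so dot products are cosine similarities
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    k = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    logits = q @ k.T / temperature                  # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    # Log-softmax over each row; the target class is the diagonal entry
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

When queries and keys are two augmented views of the same images, minimizing this loss pulls matching pairs together and pushes non-matching pairs apart, which is the property the paper then reshapes toward retrieval with its AP-based fine-tuning.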
