Paper Title
Incremental Few-Shot Object Detection for Robotics
Paper Authors
Paper Abstract
Incremental few-shot learning is highly desirable for practical robotics applications. On one hand, a robot should learn new tasks quickly and flexibly from only a few annotated training samples; on the other hand, these additional tasks should be learned in a continuous, incremental manner without dramatically forgetting previously learned knowledge. In this work, we propose a novel Class-Incremental Few-Shot Object Detection (CI-FSOD) framework that enables a deep object detection network to perform effective continual learning from just few-shot samples without re-accessing the previous training data. We achieve this by equipping the widely used Faster-RCNN detector with three elegant components. First, to best preserve performance on the pre-trained base classes, we propose a novel Dual-Embedding-Space (DES) architecture, which decouples the representation learning of base and novel categories into separate spaces. Second, to mitigate catastrophic forgetting of the accumulated novel classes, we propose a Sequential Model Fusion (SMF) method, which achieves long-term memory without additional storage cost. Third, to promote inter-task class separation in feature space, we propose a novel regularization technique that pushes the classification boundary further away from the previous classes to avoid misclassification. Overall, our framework is simple yet effective, and it outperforms the previous SOTA by a significant margin of 2.4 points in AP.
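The Sequential Model Fusion (SMF) component described above merges knowledge from successive incremental tasks without storing extra data. The abstract does not specify the fusion rule, so the sketch below is a hypothetical illustration: the function `fuse_states` and the coefficient `alpha` are assumptions, showing how the parameters of the previous and newly fine-tuned detectors could be merged element-wise so that storage stays constant no matter how many tasks accumulate.

```python
def fuse_states(prev_state, curr_state, alpha=0.5):
    """Hypothetical sequential model fusion: interpolate element-wise
    between the previous fused parameters and the current task's
    fine-tuned parameters. Only two parameter sets are ever kept,
    so memory cost does not grow with the number of tasks."""
    return {
        name: [alpha * p + (1.0 - alpha) * c
               for p, c in zip(prev_state[name], curr_state[name])]
        for name in curr_state
    }

# Toy usage: two flattened "weight vectors" for one detector layer.
prev = {"cls_head.weight": [0.0, 2.0]}
curr = {"cls_head.weight": [2.0, 0.0]}
fused = fuse_states(prev, curr, alpha=0.5)
print(fused["cls_head.weight"])  # [1.0, 1.0]
```

In a real detector the same interpolation would run over every tensor in the model's state dict; the toy lists above stand in for those tensors.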