Paper Title

In-memory Realization of In-situ Few-shot Continual Learning with a Dynamically Evolving Explicit Memory

Paper Authors

Geethan Karunaratne, Michael Hersche, Jovin Langenegger, Giovanni Cherubini, Manuel Le Gallo-Bourdeau, Urs Egger, Kevin Brew, Sam Choi, Injo Ok, Mary Claire Silvestre, Ning Li, Nicole Saulnier, Victor Chan, Ishtiaq Ahsan, Vijay Narayanan, Luca Benini, Abu Sebastian, Abbas Rahimi

Abstract

Continually learning new classes from a few training examples without forgetting previous old classes demands a flexible architecture with an inevitably growing portion of storage, in which new examples and classes can be incrementally stored and efficiently retrieved. One viable architectural solution is to tightly couple a stationary deep neural network to a dynamically evolving explicit memory (EM). As the centerpiece of this architecture, we propose an EM unit that leverages energy-efficient in-memory compute (IMC) cores during the course of continual learning operations. We demonstrate for the first time how the EM unit can physically superpose multiple training examples, expand to accommodate unseen classes, and perform similarity search during inference, using operations on an IMC core based on phase-change memory (PCM). Specifically, the physical superposition of a few encoded training examples is realized via in-situ progressive crystallization of PCM devices. The classification accuracy achieved on the IMC core remains within a range of 1.28%--2.5% compared to that of the state-of-the-art full-precision baseline software model on both the CIFAR-100 and miniImageNet datasets when continually learning 40 novel classes (from only five examples per class) on top of 60 old classes.
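The three EM operations described above (superposing encoded examples into class prototypes, expanding for unseen classes, and similarity search at inference) can be emulated in software. The sketch below is a minimal illustration, not the paper's implementation: on the actual hardware, superposition happens via in-situ progressive crystallization of PCM devices and the search is an in-memory dot product; here both are replaced by plain NumPy arithmetic, and the class name, method names, and vector dimension are illustrative assumptions.

```python
import numpy as np

class ExplicitMemory:
    """Hypothetical software sketch of the dynamically evolving explicit
    memory (EM). One row of `prototypes` stands in for the PCM column
    that physically superposes a class's encoded training examples."""

    def __init__(self, dim=512):
        self.dim = dim
        self.prototypes = np.empty((0, dim))  # one row per class seen so far

    def expand(self):
        """Allocate a fresh row when a previously unseen class arrives."""
        self.prototypes = np.vstack([self.prototypes,
                                     np.zeros((1, self.dim))])

    def superpose(self, class_idx, encoded_example):
        """Accumulate an encoded example into its class prototype
        (software analogue of progressive crystallization)."""
        self.prototypes[class_idx] += encoded_example

    def similarity_search(self, query):
        """Return the index of the most similar superposed prototype
        (cosine similarity standing in for the in-memory search)."""
        p = self.prototypes
        sims = (p @ query) / (np.linalg.norm(p, axis=1)
                              * np.linalg.norm(query) + 1e-9)
        return int(np.argmax(sims))
```

In this emulation, superposing five examples per class and then querying reduces to nearest-prototype classification, which is the functional behavior the PCM core provides in a single analog operation.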
