Paper Title

Pseudo Rehearsal using non photo-realistic images

Paper Authors

Bhasker Sri Harsha Suri, Kalidas Yeturu

Paper Abstract

Deep neural networks forget previously learnt tasks when faced with learning new ones. This is called catastrophic forgetting. Rehearsing the neural network on the training data of the previous task can protect the network from catastrophic forgetting. Since rehearsal requires storing the entire previous dataset, pseudo-rehearsal was proposed, in which samples resembling the previous data are generated synthetically for rehearsal. In an image classification setting, while current techniques try to generate photo-realistic synthetic data, we demonstrate that neural networks can be rehearsed on data that is not photo-realistic and still retain the previous task well. We also demonstrate that forgoing the constraint of photo-realism in the generated data can significantly reduce the computational and memory resources consumed by pseudo-rehearsal.
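The core mechanism described above — generating synthetic inputs, labelling them with the old model, and mixing them into the new task's training batches — can be sketched as follows. This is an illustrative toy sketch, not the authors' implementation; the linear "old model" and the noise-based generator are assumptions standing in for a trained network and a synthetic-sample generator (the paper's point being that these samples need not be photo-realistic).

```python
import numpy as np

rng = np.random.default_rng(0)

def old_model(x):
    # Stand-in for the network trained on the previous task:
    # a fixed linear classifier over 2 classes (hypothetical).
    w = np.array([[1.0, -1.0], [-1.0, 1.0]])
    return (x @ w).argmax(axis=1)

def generate_pseudo_samples(n, dim=2):
    # Generator for rehearsal inputs. Per the abstract, these need not
    # be photo-realistic; here they are simply Gaussian noise vectors.
    return rng.normal(size=(n, dim))

def make_rehearsal_batch(new_x, new_y, n_pseudo):
    # Label the synthetic samples with the old model's predictions,
    # then shuffle them together with the new task's data so the
    # network rehearses the old task while learning the new one.
    px = generate_pseudo_samples(n_pseudo, dim=new_x.shape[1])
    py = old_model(px)
    x = np.concatenate([new_x, px])
    y = np.concatenate([new_y, py])
    perm = rng.permutation(len(x))
    return x[perm], y[perm]

new_x = rng.normal(size=(8, 2))
new_y = rng.integers(0, 2, size=8)
bx, by = make_rehearsal_batch(new_x, new_y, n_pseudo=8)
print(bx.shape, by.shape)
```

Each training batch for the new task thus carries pseudo-samples whose targets encode the old model's behaviour, which is what protects the network from catastrophic forgetting without storing the original data.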
