Paper Title

Multitask Learning with Single Gradient Step Update for Task Balancing

Authors

Sungjae Lee, Youngdoo Son

Abstract

Multitask learning is a methodology to boost generalization performance and also reduce computational intensity and memory usage. However, learning multiple tasks simultaneously can be more difficult than learning a single task because it can cause imbalance among tasks. To address the imbalance problem, we propose an algorithm to balance between tasks at the gradient level by applying gradient-based meta-learning to multitask learning. The proposed method trains shared layers and task-specific layers separately so that the two layers with different roles in a multitask network can be fitted to their own purposes. In particular, the shared layer that contains informative knowledge shared among tasks is trained by employing single gradient step update and inner/outer loop training to mitigate the imbalance problem at the gradient level. We apply the proposed method to various multitask computer vision problems and achieve state-of-the-art performance.
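The inner/outer loop scheme described above can be illustrated with a minimal sketch: for each task, the shared parameters take a single inner gradient step on that task's loss, and the outer update aggregates the gradients of the post-step losses across tasks. This is an illustrative toy example, not the authors' implementation: the quadratic per-task losses, the function names, and the learning rates are all assumptions made for the sketch, and task-specific parameters are held fixed here to keep the focus on the shared-layer update.

```python
import numpy as np

def task_loss(shared, specific, target):
    # Toy per-task loss (an assumption for this sketch): distance of the
    # combined shared + task-specific parameters from a per-task target.
    return 0.5 * np.sum((shared + specific - target) ** 2)

def task_grad_shared(shared, specific, target):
    # Gradient of the toy loss with respect to the shared parameters.
    return shared + specific - target

def train_shared(shared, specifics, targets, inner_lr=0.1, outer_lr=0.1, steps=100):
    """Single-gradient-step inner/outer loop update of the shared parameters."""
    for _ in range(steps):
        outer_grad = np.zeros_like(shared)
        for specific, target in zip(specifics, targets):
            # Inner loop: one gradient step on the shared parameters per task.
            adapted = shared - inner_lr * task_grad_shared(shared, specific, target)
            # Outer loop: accumulate the gradient of each task's post-step loss
            # with respect to the original shared parameters. For this quadratic
            # loss, differentiating through the inner step scales the gradient
            # by (1 - inner_lr).
            outer_grad += (1 - inner_lr) * task_grad_shared(adapted, specific, target)
        shared = shared - outer_lr * outer_grad
    return shared

# Two conflicting toy tasks: the balanced shared solution is their midpoint.
specifics = [np.zeros(2), np.zeros(2)]
targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
shared = train_shared(np.zeros(2), specifics, targets)
```

With two symmetric tasks, the outer update drives the shared parameters toward the point that balances both post-step losses, which for this toy loss is the mean of the task targets.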
