Paper Title

Domain Generalization using Pretrained Models without Fine-tuning

Paper Authors

Ziyue Li, Kan Ren, Xinyang Jiang, Bo Li, Haipeng Zhang, Dongsheng Li

Paper Abstract

Fine-tuning pretrained models is a common practice in domain generalization (DG) tasks. However, fine-tuning is usually computationally expensive due to the ever-growing size of pretrained models. More importantly, it may cause overfitting on the source domains and compromise the models' generalization ability, as shown in recent work. Generally, pretrained models possess some level of generalization ability and can achieve decent performance on specific domains and samples. However, the generalization performance of pretrained models can vary significantly across different test domains and even individual samples, which makes it challenging to best leverage pretrained models in DG tasks. In this paper, we propose a novel domain generalization paradigm to better leverage various pretrained models, named Specialized Ensemble Learning for Domain Generalization (SEDGE). It first trains a linear label space adapter on top of fixed pretrained models, which transforms the outputs of the pretrained models into the label space of the target domain. Then, an ensemble network aware of model specialty is proposed to dynamically dispatch the proper pretrained models to predict each test sample. Experimental studies on several benchmarks show that SEDGE achieves significant performance improvements compared to strong baselines, including the state-of-the-art methods in DG tasks, and reduces the trainable parameters by ~99% and the training time by ~99.5%.
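
The abstract outlines a two-stage design: linear label-space adapters trained on top of frozen pretrained models, followed by a specialty-aware ensemble that selects or weights the adapted models per test sample. Below is a minimal PyTorch sketch of that idea; the class names (LabelSpaceAdapter, SpecialtyEnsemble), the softmax gating, and the dummy backbones are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class LabelSpaceAdapter(nn.Module):
    """Frozen pretrained model plus a trainable linear map into the target label space."""

    def __init__(self, backbone: nn.Module, source_dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False  # the pretrained model itself is never fine-tuned
        self.adapter = nn.Linear(source_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():
            outputs = self.backbone(x)  # e.g. the pretrained model's logits or features
        return self.adapter(outputs)


class SpecialtyEnsemble(nn.Module):
    """Per-sample gating over the adapted models: a soft stand-in for
    dispatching the most suitable pretrained model to each test sample."""

    def __init__(self, adapted_models: list[nn.Module], gate_dim: int):
        super().__init__()
        self.models = nn.ModuleList(adapted_models)
        self.gate = nn.Linear(gate_dim, len(adapted_models))

    def forward(self, x: torch.Tensor, gate_feats: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.gate(gate_feats), dim=-1)    # (B, M)
        logits = torch.stack([m(x) for m in self.models], dim=1)  # (B, M, C)
        return (weights.unsqueeze(-1) * logits).sum(dim=1)        # (B, C)


if __name__ == "__main__":
    # Toy usage: dummy linear "backbones" stand in for real pretrained models.
    backbones = [nn.Linear(32, 1000) for _ in range(3)]
    adapted = [LabelSpaceAdapter(b, source_dim=1000, num_classes=7) for b in backbones]
    ensemble = SpecialtyEnsemble(adapted, gate_dim=32)
    x = torch.randn(4, 32)
    print(ensemble(x, gate_feats=x).shape)  # torch.Size([4, 7])
```

In this sketch only the adapters and the gate carry gradients, which mirrors the abstract's claim of a drastically reduced trainable-parameter count; how the gate features and the dispatching are actually computed in SEDGE is not specified in this section.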
