Paper Title

Improving Unsupervised Domain Adaptation by Reducing Bi-level Feature Redundancy

Authors

Mengzhu Wang, Xiang Zhang, Long Lan, Wei Wang, Huibin Tan, Zhigang Luo

Abstract

Reducing feature redundancy has shown beneficial effects for improving the accuracy of deep learning models, and it is therefore also indispensable for models of unsupervised domain adaptation (UDA). Nevertheless, most recent efforts in the field of UDA ignore this point. Moreover, the mainstream schemes that realize this idea are generally developed independently of UDA and involve only a single domain, and thus might not be effective for cross-domain tasks. In this paper, we emphasize the significance of reducing feature redundancy for improving UDA in a bi-level way. At the first level, we try to ensure compact domain-specific features with a transferable decorrelated normalization module, which preserves domain-specific information while easing the side effect of feature redundancy on the subsequent domain invariance. At the second level, domain-invariant feature redundancy caused by the domain-shared representation is further mitigated via an alternative form of orthogonality for better generalization. These two novel aspects can be easily plugged into any BN-based backbone neural network. Specifically, simply applying them to ResNet50 achieves performance competitive with the state of the art on five popular benchmarks. Our code will be available at https://github.com/dreamkily/gUDA.
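
To make the second-level idea more concrete, the sketch below shows one generic way to penalize redundancy among domain-shared feature dimensions with a soft orthogonality term on a mini-batch of features. This is only an illustrative PyTorch sketch, not the paper's decorrelated normalization module or its specific orthogonality constraint; the function name `orthogonality_penalty`, the weight `lam`, and the surrounding losses are hypothetical.

```python
# Minimal sketch (assumed, not the authors' implementation) of a soft
# orthogonality penalty that discourages redundant feature dimensions.
import torch
import torch.nn.functional as F


def orthogonality_penalty(features: torch.Tensor) -> torch.Tensor:
    """Encourage decorrelated feature dimensions.

    features: (batch, dim) activations, e.g. pooled ResNet50 outputs.
    Returns the squared Frobenius norm of the off-diagonal part of the
    batch-normalized feature correlation matrix.
    """
    f = F.normalize(features, dim=0)            # unit-norm each dimension over the batch
    corr = f.t() @ f                            # (dim, dim) correlation-like matrix
    off_diag = corr - torch.diag(torch.diag(corr))
    return off_diag.pow(2).sum()


# Hypothetical usage inside a training step:
# loss = task_loss + domain_alignment_loss + lam * orthogonality_penalty(shared_features)
```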
