Paper Title
Progressive Learning with Cross-Window Consistency for Semi-Supervised Semantic Segmentation
Paper Authors
Paper Abstract
Semi-supervised semantic segmentation focuses on exploiting a small amount of labeled data together with a large amount of unlabeled data, which is more in line with the demands of real-world image understanding applications. However, it is still hindered by the inability to fully and effectively leverage unlabeled images. In this paper, we reveal that cross-window consistency (CWC) is helpful for comprehensively extracting auxiliary supervision from unlabeled data. Additionally, we propose a novel CWC-driven progressive learning framework that optimizes the deep network by mining weak-to-strong constraints from massive unlabeled data. More specifically, we present a biased cross-window consistency (BCC) loss with an importance factor, which helps the deep network explicitly constrain the confidence maps of overlapping regions in different windows so that they remain semantically consistent with larger contexts. In addition, we propose a dynamic pseudo-label memory bank (DPM) that provides high-consistency and high-reliability pseudo-labels to further optimize the network. Extensive experiments on three representative datasets covering urban views, medical scenarios, and satellite scenes demonstrate that our framework consistently outperforms state-of-the-art methods by a large margin. The code will be made publicly available.
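To make the cross-window consistency idea concrete, the following PyTorch snippet is a minimal illustrative sketch, not the authors' released implementation: it assumes the predictions of two windows of one unlabeled image have already been cropped to their shared overlap, and it uses a hypothetical importance factor alpha with a symmetric KL penalty as a stand-in for the paper's BCC loss.

# Minimal sketch of a cross-window consistency penalty (illustration only;
# the importance factor `alpha` and the symmetric-KL form are assumptions,
# not the paper's exact BCC formulation).
import torch
import torch.nn.functional as F

def cross_window_consistency_loss(logits_a: torch.Tensor,
                                  logits_b: torch.Tensor,
                                  alpha: float = 1.0) -> torch.Tensor:
    """Penalize disagreement between confidence maps of the same overlap region.

    logits_a, logits_b: (N, C, H, W) logits for the identical overlapping area,
    cropped from two different windows of one unlabeled image.
    alpha: illustrative importance factor re-weighting the consistency term.
    """
    prob_a = F.softmax(logits_a, dim=1)
    prob_b = F.softmax(logits_b, dim=1)

    # Symmetric KL divergence between the two confidence maps over the overlap.
    kl_ab = F.kl_div(prob_a.clamp_min(1e-8).log(), prob_b, reduction="batchmean")
    kl_ba = F.kl_div(prob_b.clamp_min(1e-8).log(), prob_a, reduction="batchmean")
    return alpha * 0.5 * (kl_ab + kl_ba)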