Paper Title
Unsupervised Multimodal Change Detection Based on Structural Relationship Graph Representation Learning
Paper Authors
Paper Abstract
Unsupervised multimodal change detection is a practical and challenging topic that can play an important role in time-sensitive emergency applications. To address the challenge that multimodal remote sensing images cannot be compared directly because of their modal heterogeneity, we exploit two types of modality-independent structural relationships in multimodal images. Specifically, we present a structural relationship graph representation learning framework for measuring the similarity of these two structural relationships. First, structural graphs are generated from the preprocessed multimodal image pair using an object-based image analysis approach. Then, a structural relationship graph convolutional autoencoder (SR-GCAE) is proposed to learn robust and representative features from the graphs. Two loss functions, aimed at reconstructing vertex information and edge information respectively, are presented to make the learned representations applicable to structural relationship similarity measurement. Subsequently, the similarity levels of the two structural relationships are computed from the learned graph representations, and two difference images are generated based on these similarity levels. After the difference images are obtained, an adaptive fusion strategy is presented to fuse them. Finally, a morphological filtering-based postprocessing approach is employed to refine the detection results. Experimental results on five datasets with different modal combinations demonstrate the effectiveness of the proposed method.
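The abstract does not specify SR-GCAE's architecture, so the following PyTorch sketch only illustrates the general idea of a graph convolutional autoencoder trained with separate vertex-reconstruction and edge-reconstruction losses. The layer sizes, the symmetric adjacency normalization, the inner-product edge decoder, and the loss weight `alpha` are all illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (assumptions noted above): a graph convolutional
# autoencoder with the two reconstruction objectives the abstract
# names, i.e., vertex information and edge information.
import torch
import torch.nn as nn

def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize A + I (a common GCN convention)."""
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = torch.diag(a.sum(dim=1).pow(-0.5))
    return d_inv_sqrt @ a @ d_inv_sqrt

class GCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, a_norm):
        # Propagate features over the graph, then transform.
        return torch.relu(self.linear(a_norm @ x))

class GCAE(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int = 64, z_dim: int = 32):
        super().__init__()
        self.enc1 = GCNLayer(in_dim, hidden_dim)
        self.enc2 = GCNLayer(hidden_dim, z_dim)
        self.vertex_decoder = nn.Linear(z_dim, in_dim)  # reconstructs node features

    def forward(self, x, a_norm):
        z = self.enc2(self.enc1(x, a_norm), a_norm)
        x_hat = self.vertex_decoder(z)     # vertex-information branch
        a_hat = torch.sigmoid(z @ z.t())   # edge-information branch (inner product)
        return z, x_hat, a_hat

def reconstruction_losses(x, adj, x_hat, a_hat, alpha=1.0):
    # Two losses, one per structural-information type; `alpha` balances them.
    vertex_loss = nn.functional.mse_loss(x_hat, x)
    edge_loss = nn.functional.binary_cross_entropy(a_hat, adj)
    return vertex_loss + alpha * edge_loss

# Toy usage: 10 graph vertices (e.g., image objects) with 8-D features.
x = torch.randn(10, 8)
adj = (torch.rand(10, 10) > 0.7).float()
adj = ((adj + adj.t()) > 0).float()  # make the graph undirected
a_norm = normalize_adj(adj)
model = GCAE(in_dim=8)
z, x_hat, a_hat = model(x, a_norm)
loss = reconstruction_losses(x, adj, x_hat, a_hat)
loss.backward()
```

In a change-detection setting along these lines, the learned embeddings `z` from the two images' graphs would then be compared to score structural-relationship similarity per vertex; that comparison step is not shown here.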