Paper Title

Learning Invariant Representations for Equivariant Neural Networks Using Orthogonal Moments

Paper Authors

Jaspreet Singh, Chandan Singh

Paper Abstract

The convolutional layers of standard convolutional neural networks (CNNs) are equivariant to translation. However, the convolution and fully-connected layers are neither equivariant nor invariant to other affine geometric transformations. Recently, a new class of CNNs has been proposed in which the conventional layers of CNNs are replaced with equivariant convolution, pooling, and batch-normalization layers. The final classification layer in equivariant neural networks is invariant to different affine geometric transformations such as rotation, reflection, and translation; the scalar value is obtained either by eliminating the spatial dimensions of the filter responses using convolution and down-sampling throughout the network, or by averaging over the filter responses. In this work, we propose to integrate orthogonal moments, which give the high-order statistics of a function, as an effective means of encoding global invariance with respect to rotation, reflection, and translation in fully-connected layers. As a result, the intermediate layers of the network become equivariant while the classification layer becomes invariant. The most widely used Zernike, pseudo-Zernike, and orthogonal Fourier-Mellin moments are considered for this purpose. The effectiveness of the proposed work is evaluated by integrating the invariant transition and fully-connected layers into the architecture of group-equivariant CNNs (G-CNNs) on the rotated-MNIST and CIFAR-10 datasets.
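
To make the invariance property concrete, below is a minimal NumPy sketch of the standard Zernike moment computation the abstract refers to: the magnitude |Z_nm| of each moment is invariant to rotation of the input, which is the kind of scalar descriptor the paper plugs into the invariant classification layer. The function names (`zernike_radial`, `zernike_invariants`), the `max_n` parameter, and the (n+1)/pi normalization follow the textbook definition of Zernike moments; this is an illustrative assumption, not the authors' implementation or their network integration.

```python
import numpy as np
from math import factorial

def zernike_radial(n, m, rho):
    """Radial polynomial R_{n|m|}(rho) of the Zernike basis."""
    m = abs(m)
    R = np.zeros_like(rho)
    for s in range((n - m) // 2 + 1):
        c = ((-1) ** s * factorial(n - s)
             / (factorial(s)
                * factorial((n + m) // 2 - s)
                * factorial((n - m) // 2 - s)))
        R += c * rho ** (n - 2 * s)
    return R

def zernike_invariants(f, max_n=4):
    """Rotation-invariant magnitudes |Z_nm| of a 2-D feature map f.

    The map is sampled on the unit disk; rotating the input only
    shifts the phase of Z_nm, so |Z_nm| is unchanged.
    """
    h, w = f.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Map pixel coordinates onto the unit disk.
    x = (2 * xs - (w - 1)) / (w - 1)
    y = (2 * ys - (h - 1)) / (h - 1)
    rho = np.hypot(x, y)
    theta = np.arctan2(y, x)
    mask = rho <= 1.0
    invariants = []
    for n in range(max_n + 1):
        for m in range(n + 1):
            if (n - m) % 2:  # n - |m| must be even for a valid basis function
                continue
            V = zernike_radial(n, m, rho) * np.exp(-1j * m * theta)
            Z = (n + 1) / np.pi * np.sum(f[mask] * V[mask])
            invariants.append(abs(Z))
    return np.array(invariants)

# Sanity check: a 90-degree rotation maps the symmetric sampling grid
# onto itself, so the invariants agree up to floating-point error.
f = np.random.rand(32, 32)
print(np.allclose(zernike_invariants(f),
                  zernike_invariants(np.rot90(f)), atol=1e-6))
```

Pseudo-Zernike and orthogonal Fourier-Mellin moments differ only in the radial polynomial used in place of `zernike_radial`; the phase structure, and hence the rotation invariance of the magnitudes, is the same.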
