Paper Title

Lifted Regression/Reconstruction Networks

Authors

Rasmus Kjær Høier, Christopher Zach

Abstract

In this work we propose lifted regression/reconstruction networks (LRRNs), which combine lifted neural networks with a guaranteed Lipschitz continuity property for the output layer. Lifted neural networks explicitly optimize an energy model to infer the unit activations and therefore---in contrast to standard feed-forward neural networks---allow bidirectional feedback between layers. So far, lifted neural networks have been modelled around standard feed-forward architectures. We propose to take further advantage of the feedback property by letting the layers simultaneously perform regression and reconstruction. The resulting lifted network architecture allows one to control the desired amount of Lipschitz continuity, which is an important feature for obtaining adversarially robust regression and classification methods. We analyse and numerically demonstrate applications for unsupervised and supervised learning.
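To make the "energy model with bidirectional feedback" idea concrete, the following is a minimal illustrative sketch (not the paper's exact formulation; all weights and penalty terms are hypothetical): a lifted two-layer network treats the hidden activations as free variables and infers them by minimizing an energy that couples adjacent layers, so information flows both forward and backward, unlike a single feed-forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3)) * 0.5   # input -> hidden weights (hypothetical)
W2 = rng.normal(size=(2, 4)) * 0.5   # hidden -> output weights (hypothetical)

def energy(z1, z2, x, y):
    # Quadratic penalties couple adjacent layers in both directions;
    # the last term attaches the output activations to the target y.
    return (np.sum((z1 - W1 @ x) ** 2)
            + np.sum((z2 - W2 @ z1) ** 2)
            + np.sum((z2 - y) ** 2))

def infer(x, y, steps=500, lr=0.05):
    # Inference = gradient descent on the energy over the activations,
    # starting from the ordinary feed-forward values.
    z1, z2 = W1 @ x, W2 @ (W1 @ x)
    for _ in range(steps):
        g1 = 2 * (z1 - W1 @ x) - 2 * W2.T @ (z2 - W2 @ z1)
        g2 = 2 * (z2 - W2 @ z1) + 2 * (z2 - y)
        z1, z2 = z1 - lr * g1, z2 - lr * g2
    return z1, z2

x = rng.normal(size=3)
y = np.zeros(2)
z1, z2 = infer(x, y)
e_final = energy(z1, z2, x, y)
e_ff = energy(W1 @ x, W2 @ (W1 @ x), x, y)  # energy of the pure feed-forward pass
print(e_final, e_ff)  # inference with top-down feedback lowers the energy
```

The target term in the energy is what lets the output layer feed back into the hidden activations; the same mechanism is what the abstract exploits to let layers perform regression and reconstruction simultaneously.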
