Paper Title

Learning Discrete Structured Variational Auto-Encoder using Natural Evolution Strategies

Paper Authors

Alon Berliner, Guy Rotman, Yossi Adi, Roi Reichart, Tamir Hazan

Paper Abstract

Discrete variational auto-encoders (VAEs) are able to represent semantic latent spaces in generative learning. In many real-life settings, the discrete latent space consists of high-dimensional structures, and propagating gradients through the relevant structures often requires enumerating over an exponentially large latent space. Recently, various approaches were devised to propagate approximated gradients without enumerating over the space of possible structures. In this work, we use Natural Evolution Strategies (NES), a class of gradient-free black-box optimization algorithms, to learn discrete structured VAEs. The NES algorithms are computationally appealing as they estimate gradients with forward-pass evaluations only; thus they do not require propagating gradients through the discrete structures. We demonstrate empirically that optimizing discrete structured VAEs using NES is as effective as gradient-based approximations. Lastly, we prove that NES converges for non-Lipschitz functions, such as those appearing in discrete structured VAEs.
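The core idea the abstract describes — estimating a gradient from forward-pass evaluations only — can be illustrated with a minimal NES sketch. This is not the paper's implementation; the objective `f`, the antithetic sampling, and all hyperparameters here are illustrative assumptions. NES perturbs the parameters with Gaussian noise, scores each perturbation by a forward evaluation of the (possibly non-differentiable) objective, and combines the scores into a search-gradient estimate:

```python
import numpy as np

def nes_gradient(f, theta, sigma=0.1, n_samples=50, rng=None):
    """Estimate the gradient of the Gaussian-smoothed objective
    E[f(theta + sigma * eps)] using only forward evaluations of f.
    No backpropagation through f is needed, so f may be a black box
    (e.g. a decoder driven by a discrete structure)."""
    rng = rng if rng is not None else np.random.default_rng()
    eps = rng.standard_normal((n_samples, theta.size))
    # Antithetic sampling: score mirrored perturbations to reduce variance.
    scores = np.array([f(theta + sigma * e) - f(theta - sigma * e) for e in eps])
    return (scores[:, None] * eps).sum(axis=0) / (2 * n_samples * sigma)

# Toy black-box objective (illustrative): maximize -||x - 3||^2.
f = lambda x: -np.sum((x - 3.0) ** 2)

rng = np.random.default_rng(0)
theta = np.zeros(4)
for _ in range(200):
    theta += 0.1 * nes_gradient(f, theta, rng=rng)  # plain gradient ascent
```

After the loop, `theta` approaches the optimum at 3.0 even though `f` was only ever called in the forward direction, which is the computational appeal the abstract points to for discrete structured latent spaces.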
