Paper Title

Conditional Permutation Invariant Flows

Paper Authors

Berend Zwartsenberg, Adam Ścibior, Matthew Niedoba, Vasileios Lioutas, Yunpeng Liu, Justice Sefas, Setareh Dabiri, Jonathan Wilder Lavington, Trevor Campbell, Frank Wood

Abstract

We present a novel, conditional generative probabilistic model of set-valued data with a tractable log density. This model is a continuous normalizing flow governed by permutation equivariant dynamics. These dynamics are driven by a learnable per-set-element term and pairwise interactions, both parametrized by deep neural networks. We illustrate the utility of this model via applications including (1) complex traffic scene generation conditioned on visually specified map information, and (2) object bounding box generation conditioned directly on images. We train our model by maximizing the expected likelihood of labeled conditional data under our flow, with the aid of a penalty that ensures the dynamics are smooth and hence efficiently solvable. Our method significantly outperforms non-permutation invariant baselines in terms of log likelihood and domain-specific metrics (offroad, collision, and combined infractions), yielding realistic samples that are difficult to distinguish from real data.
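
The dynamics described in the abstract combine a learnable per-set-element term with pairwise interactions, both conditioned on external context. The sketch below is a rough illustration of such a permutation-equivariant vector field, not the authors' implementation; the module name `EquivariantDynamics`, the layer sizes, and the concatenation-based conditioning on a context vector are assumptions made for the example.

```python
import torch
import torch.nn as nn


class EquivariantDynamics(nn.Module):
    """Velocity field over a set of N elements: per-element term + summed pairwise term."""

    def __init__(self, dim: int, cond_dim: int, hidden: int = 64):
        super().__init__()
        # Per-element term: applied to each element independently.
        self.element_net = nn.Sequential(
            nn.Linear(dim + cond_dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )
        # Pairwise term: applied to every ordered pair, then summed over partners.
        self.pair_net = nn.Sequential(
            nn.Linear(2 * dim + cond_dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, t: torch.Tensor, x: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        # x: (batch, N, dim) set elements; cond: (batch, cond_dim) conditioning context.
        b, n, d = x.shape
        tt = t.reshape(1, 1, 1).expand(b, n, 1)
        c = cond.unsqueeze(1).expand(b, n, -1)
        per_element = self.element_net(torch.cat([x, c, tt], dim=-1))

        # All ordered pairs (i, j); summing over j is permutation invariant,
        # so the full vector field is permutation equivariant in the set elements.
        xi = x.unsqueeze(2).expand(b, n, n, d)
        xj = x.unsqueeze(1).expand(b, n, n, d)
        cp = cond.reshape(b, 1, 1, -1).expand(b, n, n, -1)
        tp = t.reshape(1, 1, 1, 1).expand(b, n, n, 1)
        pairwise = self.pair_net(torch.cat([xi, xj, cp, tp], dim=-1)).sum(dim=2)

        return per_element + pairwise


# Permuting the input set permutes the output velocities in the same way.
f = EquivariantDynamics(dim=2, cond_dim=8)
x, cond, t = torch.randn(1, 5, 2), torch.randn(1, 8), torch.tensor(0.5)
perm = torch.randperm(5)
assert torch.allclose(f(t, x, cond)[:, perm], f(t, x[:, perm], cond), atol=1e-5)
```

In a continuous normalizing flow, a field of this form would be integrated over time by an ODE solver, with the log density tracked via the divergence of the field; the final assertion only demonstrates the equivariance property the abstract relies on.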
