Paper Title
AdaCM: Adaptive ColorMLP for Real-Time Universal Photo-realistic Style Transfer
Authors
Abstract
Photo-realistic style transfer aims at migrating the artistic style from an exemplar style image to a content image, producing a result image without spatial distortions or unrealistic artifacts. Impressive results have been achieved by recent deep models. However, deep neural network-based methods are too expensive to run in real time. Meanwhile, bilateral grid-based methods are much faster but still contain artifacts such as overexposure. In this work, we propose the \textbf{Adaptive ColorMLP (AdaCM)}, an effective and efficient framework for universal photo-realistic style transfer. First, we find that the complex non-linear color mapping between the input and target domains can be efficiently modeled by a small multi-layer perceptron (ColorMLP) model. Then, in \textbf{AdaCM}, we adopt a CNN encoder to adaptively predict all parameters of the ColorMLP, conditioned on each input content and style image pair. Experimental results demonstrate that AdaCM can generate vivid and high-quality stylization results. Meanwhile, our AdaCM is ultra-fast and can process a 4K-resolution image in 6 ms on one V100 GPU.
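As a rough illustration of the architecture described in the abstract (not the authors' implementation), the PyTorch sketch below shows a CNN encoder that predicts every weight and bias of a tiny per-pixel ColorMLP from the (content, style) pair, after which the predicted MLP is applied to each pixel's RGB value at full resolution. The encoder backbone, the ColorMLP layer widths, and the downsampling resolution are all assumptions for illustration only.

```python
# Minimal sketch of the AdaCM idea: an encoder conditioned on the (content, style)
# pair predicts the parameters of a small per-pixel color-mapping MLP.
# All sizes below are illustrative assumptions, not the paper's configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

MLP_DIMS = [3, 16, 16, 3]  # assumed ColorMLP widths: RGB in -> two hidden layers -> RGB out


def num_mlp_params(dims):
    """Total number of weights and biases for an MLP with the given layer widths."""
    return sum(dims[i] * dims[i + 1] + dims[i + 1] for i in range(len(dims) - 1))


class ParamEncoder(nn.Module):
    """Small CNN mapping a (content, style) image pair to all ColorMLP parameters."""

    def __init__(self, dims=MLP_DIMS):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(6, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, num_mlp_params(dims))

    def forward(self, content, style):
        # Downsample both images so the encoder cost is independent of output resolution.
        c = F.interpolate(content, size=(256, 256), mode="bilinear", align_corners=False)
        s = F.interpolate(style, size=(256, 256), mode="bilinear", align_corners=False)
        feat = self.backbone(torch.cat([c, s], dim=1)).flatten(1)  # (B, 128)
        return self.head(feat)                                     # (B, num_params)


def apply_colormlp(content, params, dims=MLP_DIMS):
    """Run the predicted MLP on every pixel of the full-resolution content image."""
    b, _, h, w = content.shape
    x = content.permute(0, 2, 3, 1).reshape(b, h * w, dims[0])  # (B, HW, 3)
    offset = 0
    for i in range(len(dims) - 1):
        n_in, n_out = dims[i], dims[i + 1]
        w_i = params[:, offset:offset + n_in * n_out].view(b, n_in, n_out)
        offset += n_in * n_out
        b_i = params[:, offset:offset + n_out].view(b, 1, n_out)
        offset += n_out
        x = torch.bmm(x, w_i) + b_i
        if i < len(dims) - 2:  # ReLU on hidden layers only
            x = F.relu(x)
    return x.reshape(b, h, w, dims[-1]).permute(0, 3, 1, 2)


if __name__ == "__main__":
    encoder = ParamEncoder()
    content = torch.rand(1, 3, 512, 512)   # full-resolution input (any size works)
    style = torch.rand(1, 3, 512, 512)
    params = encoder(content, style)       # parameters conditioned on the image pair
    out = apply_colormlp(content, params)  # per-pixel color mapping at full resolution
    print(out.shape)                       # torch.Size([1, 3, 512, 512])
```

Because the heavy CNN runs only on a fixed low-resolution copy of the pair while the per-pixel step is just a few small matrix multiplications, the cost of the full-resolution pass stays low, which is consistent with the abstract's claim of real-time 4K processing.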