Paper Title

Semantic Segmentation using Neural Ordinary Differential Equations

Paper Authors

Seyedalireza Khoshsirat, Chandra Kambhamettu

Paper Abstract

The idea of neural Ordinary Differential Equations (ODEs) is to approximate the derivative of a function (data model) instead of the function itself. In residual networks, instead of having a discrete sequence of hidden layers, the derivative of the continuous dynamics of the hidden state can be parameterized by an ODE. It has been shown that this type of neural network is able to produce the same results as an equivalent residual network for image classification. In this paper, we design a novel neural ODE for the semantic segmentation task. We start with a baseline network that consists of residual modules, then use these modules to build our neural ODE network. We show that our neural ODE achieves state-of-the-art results using 57% less memory for training, 42% less memory for testing, and 68% fewer parameters. We evaluate our model on the Cityscapes, CamVid, LIP, and PASCAL-Context datasets.
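
To illustrate the connection the abstract draws between residual blocks and neural ODEs, below is a minimal PyTorch sketch (not the authors' implementation). It reuses one convolutional function f both as a discrete residual update h + f(h) and as the right-hand side of dh/dt = f(h, t), integrated here with a simple fixed-step Euler solver. The module names, channel size, and step count are illustrative assumptions; an adaptive solver such as the one in the torchdiffeq package could be substituted.

```python
# Minimal sketch, assuming a channel-preserving convolutional dynamics
# function; this is an illustration of the residual-vs-ODE viewpoint,
# not the segmentation architecture proposed in the paper.
import torch
import torch.nn as nn


class ConvFunc(nn.Module):
    """Dynamics function f(h, t): two 3x3 convolutions, channel-preserving."""

    def __init__(self, channels: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, h: torch.Tensor, t: float) -> torch.Tensor:
        return self.net(h)  # t is unused in this simple sketch


class ResidualBlock(nn.Module):
    """Discrete view: h_{k+1} = h_k + f(h_k)."""

    def __init__(self, channels: int):
        super().__init__()
        self.f = ConvFunc(channels)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.f(h, 0.0)


class ODEBlock(nn.Module):
    """Continuous view: dh/dt = f(h, t), integrated from t=0 to t=1
    with fixed-step Euler updates that reuse a single set of weights."""

    def __init__(self, channels: int, steps: int = 4):
        super().__init__()
        self.f = ConvFunc(channels)
        self.steps = steps

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        dt = 1.0 / self.steps
        t = 0.0
        for _ in range(self.steps):
            h = h + dt * self.f(h, t)  # one Euler step
            t += dt
        return h


if __name__ == "__main__":
    x = torch.randn(1, 16, 64, 64)     # N, C, H, W feature map
    print(ResidualBlock(16)(x).shape)  # torch.Size([1, 16, 64, 64])
    print(ODEBlock(16)(x).shape)       # same shape; one weight set reused per step
```

The memory savings reported in the abstract come from this reuse: an ODE block applies the same parameters at every integration step instead of stacking separately parameterized residual blocks, and adjoint-based solvers avoid storing intermediate activations during training.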
