Paper Title

Neural Implicit Dictionary via Mixture-of-Expert Training

Paper Authors

Peihao Wang, Zhiwen Fan, Tianlong Chen, Zhangyang Wang

Paper Abstract

Representing visual signals by coordinate-based deep fully-connected networks has been shown to be advantageous over discrete grid-based representations in fitting complex details and solving inverse problems. However, acquiring such a continuous Implicit Neural Representation (INR) requires tedious per-scene training on a large number of signal measurements, which limits its practicality. In this paper, we present a generic INR framework that achieves both data and training efficiency by learning a Neural Implicit Dictionary (NID) from a data collection and representing an INR as a functional combination of basis functions sampled from the dictionary. Our NID assembles a group of coordinate-based subnetworks which are tuned to span the desired function space. After training, the representation of an unseen scene can be acquired instantly and robustly by solving for its coding coefficients. To optimize a large group of networks in parallel, we borrow the idea of Mixture-of-Experts (MoE) and train our network with a sparse gating mechanism. Our experiments show that NID speeds up the reconstruction of 2D images or 3D scenes by 2 orders of magnitude while using up to 98% less input data. We further demonstrate various applications of NID in image inpainting and occlusion removal, which are considered challenging for vanilla INR. Our code is available at https://github.com/VITA-Group/Neural-Implicit-Dict.
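
To make the abstract's method description concrete, below is a minimal, hypothetical PyTorch sketch of a neural implicit dictionary with sparse Mixture-of-Experts gating. The class names, layer sizes, and top-k routing scheme are illustrative assumptions, not the authors' released implementation (see the linked repository for that).

# Minimal sketch of the NID idea: a bank of coordinate-based MLP "experts"
# serves as the dictionary, and a scene-specific code produces sparse gating
# coefficients that linearly combine the experts' outputs. All names and
# sizes below are illustrative assumptions, not the paper's actual settings.
import torch
import torch.nn as nn

class CoordinateExpert(nn.Module):
    """One basis function: a small MLP mapping 2D coordinates to a signal value."""
    def __init__(self, hidden=64, out_dim=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, coords):              # coords: (N, 2)
        return self.net(coords)             # (N, out_dim)

class NeuralImplicitDictionary(nn.Module):
    """Dictionary of experts combined by sparse, scene-specific coefficients."""
    def __init__(self, num_experts=32, top_k=4, code_dim=128, out_dim=3):
        super().__init__()
        self.experts = nn.ModuleList(
            [CoordinateExpert(out_dim=out_dim) for _ in range(num_experts)]
        )
        self.gate = nn.Linear(code_dim, num_experts)  # scene code -> expert logits
        self.top_k = top_k

    def forward(self, coords, scene_code):
        logits = self.gate(scene_code)                 # (num_experts,)
        topk_val, topk_idx = logits.topk(self.top_k)   # keep only k active experts
        weights = torch.softmax(topk_val, dim=-1)
        out = 0.0
        for w, idx in zip(weights, topk_idx):
            out = out + w * self.experts[idx.item()](coords)  # weighted basis sum
        return out

# Fitting an unseen scene then amounts to optimizing its coding coefficients
# (here, the scene code feeding the gate) while the dictionary stays frozen.
nid = NeuralImplicitDictionary()
scene_code = torch.randn(128, requires_grad=True)
coords = torch.rand(1024, 2)                # sampled (x, y) query coordinates
pred = nid(coords, scene_code)              # (1024, 3) predicted values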
