Paper Title

Automated Dominative Subspace Mining for Efficient Neural Architecture Search

Authors

Yaofo Chen, Yong Guo, Daihai Liao, Fanbing Lv, Hengjie Song, James Tin-Yau Kwok, Mingkui Tan

Abstract

Neural Architecture Search (NAS) aims to automatically find effective architectures within a predefined search space. However, the search space is often extremely large. As a result, directly searching in such a large search space is non-trivial and also very time-consuming. To address the above issues, in each search step, we seek to limit the search space to a small but effective subspace to boost both the search performance and search efficiency. To this end, we propose a novel Neural Architecture Search method via Dominative Subspace Mining (DSM-NAS) that finds promising architectures in automatically mined subspaces. Specifically, we first perform a global search, i.e., dominative subspace mining, to find a good subspace from a set of candidates. Then, we perform a local search within the mined subspace to find effective architectures. More critically, we further boost search performance by taking well-designed/searched architectures to initialize candidate subspaces. Experimental results demonstrate that DSM-NAS not only reduces the search cost but also discovers better architectures than state-of-the-art methods in various benchmark search spaces.
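The abstract's two-level procedure (a global search that selects a promising subspace, followed by a local search inside it) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the architecture encoding (a tuple of operation indices), the subspace definition (small mutations around a center architecture), and the `evaluate` proxy objective are all assumptions made for the example.

```python
import random

NUM_OPS = 5   # operation choices per position (assumed encoding)
ARCH_LEN = 8  # number of positions per architecture (assumed)

def evaluate(arch):
    # Toy proxy objective; in real NAS this would be validation
    # accuracy of the trained (or weight-shared) architecture.
    return sum(arch)

def local_search(center, n_samples=20, radius=2):
    """Local search: sample architectures from the mined subspace
    (small mutations of the center) and keep the best one found."""
    best = center
    for _ in range(n_samples):
        cand = list(center)
        for pos in random.sample(range(ARCH_LEN), radius):
            cand[pos] = random.randrange(NUM_OPS)
        cand = tuple(cand)
        if evaluate(cand) > evaluate(best):
            best = cand
    return best

def dsm_nas(candidates, steps=10):
    """Global search picks the dominative subspace (here: the candidate
    center with the best score), local search refines inside it, and the
    improved architecture re-centers that subspace for later steps."""
    candidates = [tuple(c) for c in candidates]
    for _ in range(steps):
        idx = max(range(len(candidates)),
                  key=lambda i: evaluate(candidates[i]))
        candidates[idx] = local_search(candidates[idx])
    return max(candidates, key=evaluate)
```

A candidate pool can be seeded with random architectures or, as the abstract suggests, with well-designed/searched architectures, which gives the global search stronger subspaces to start from.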
