Title
Weighted Random Search for CNN Hyperparameter Optimization
Authors
Abstract
Nearly all models used in machine learning involve two distinct sets of parameters: the training parameters and the meta-parameters (hyperparameters). While the training parameters are learned during the training phase, the values of the hyperparameters have to be specified before learning starts. For a given dataset, we would like to find the optimal combination of hyperparameter values in a reasonable amount of time. This is a challenging task because of its computational complexity. In previous work [11], we introduced the Weighted Random Search (WRS) method, a combination of Random Search (RS) and a probabilistic greedy heuristic. In the current paper, we compare the WRS method with several state-of-the-art hyperparameter optimization methods with respect to Convolutional Neural Network (CNN) hyperparameter optimization. The criterion is the classification accuracy achieved within the same number of tested combinations of hyperparameter values. According to our experiments, the WRS algorithm outperforms the other methods.
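The exact WRS procedure is specified in [11], which this abstract does not reproduce. Purely as a loose illustration of the general idea stated above (combining random search with a probabilistic greedy step), the following Python sketch resamples each hyperparameter with a fixed per-parameter probability and otherwise greedily keeps the best value found so far. The grid, change-probabilities, and toy objective below are invented for the example and are not taken from the paper:

```python
import random

def weighted_random_search(objective, grid, change_prob, n_trials, seed=0):
    """Illustrative sketch (not the exact algorithm of [11]).

    For each hyperparameter h, a candidate either resamples a new random
    value with probability change_prob[h] (random-search exploration) or
    keeps the best value found so far (greedy exploitation).
    """
    rng = random.Random(seed)
    # Start from a fully random configuration.
    best = {h: rng.choice(values) for h, values in grid.items()}
    best_score = objective(best)
    for _ in range(n_trials - 1):
        candidate = {
            h: rng.choice(values) if rng.random() < change_prob[h] else best[h]
            for h, values in grid.items()
        }
        score = objective(candidate)
        if score > best_score:  # higher is better (e.g., accuracy)
            best, best_score = candidate, score
    return best, best_score

# Hypothetical CNN-like hyperparameter grid and change-probabilities.
grid = {"lr": [1e-1, 1e-2, 1e-3], "filters": [16, 32, 64], "kernel": [3, 5, 7]}
change_prob = {"lr": 0.9, "filters": 0.5, "kernel": 0.3}

def toy_accuracy(cfg):
    # Stand-in for a real (expensive) CNN training-and-evaluation run.
    return (-abs(cfg["lr"] - 1e-2)
            - abs(cfg["filters"] - 32) / 100
            - abs(cfg["kernel"] - 5) / 10)

best, score = weighted_random_search(toy_accuracy, grid, change_prob, n_trials=50)
print(best, score)
```

In a real setting, `toy_accuracy` would be replaced by training the CNN with the candidate configuration and returning its validation accuracy, which is why limiting the number of tested combinations matters.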