Paper Title

Hyperparameter Tuning in Echo State Networks

Paper Author

Matzner, Filip

Abstract

Echo State Networks represent a type of recurrent neural network with a large randomly generated reservoir and a small number of readout connections trained via linear regression. The most common topology of the reservoir is a fully connected network of up to thousands of neurons. Over the years, researchers have introduced a variety of alternative reservoir topologies, such as a circular network or a linear path of connections. When comparing the performance of different topologies or other architectural changes, it is necessary to tune the hyperparameters for each of the topologies separately, since their properties may differ significantly. The hyperparameter tuning is usually carried out manually by selecting the best performing set of parameters from a sparse grid of predefined combinations. Unfortunately, this approach may lead to underperforming configurations, especially for sensitive topologies. We propose an alternative approach to hyperparameter tuning based on the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Using this approach, we have improved multiple topology comparison results by orders of magnitude, suggesting that topology alone does not play as important a role as properly tuned hyperparameters.
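To make the architecture described in the abstract concrete, below is a minimal sketch of an Echo State Network in NumPy: a fully connected random reservoir rescaled to a target spectral radius, driven by an input sequence, with the readout trained by (ridge-regularised) linear regression. The hyperparameter values, the sine toy task, and all variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and hyperparameters (not the paper's values).
n_inputs, n_reservoir = 1, 200
spectral_radius, input_scaling = 0.9, 0.5

# Fully connected random reservoir, rescaled to the target spectral radius.
W = rng.standard_normal((n_reservoir, n_reservoir))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
W_in = input_scaling * rng.uniform(-1.0, 1.0, (n_reservoir, n_inputs))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect the states."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(np.linspace(0, 20 * np.pi, 2000))
target = np.roll(u, -1)
X = run_reservoir(u)

# Readout trained by ridge regression; discard an initial washout period.
washout, reg = 100, 1e-6
Xw, yw = X[washout:-1], target[washout:-1]
W_out = np.linalg.solve(Xw.T @ Xw + reg * np.eye(n_reservoir), Xw.T @ yw)
pred = Xw @ W_out
mse = np.mean((pred - yw) ** 2)
```

Only the readout weights `W_out` are trained; `W` and `W_in` stay fixed after initialisation, which is what makes hyperparameters such as the spectral radius and input scaling so influential and, per the paper, worth tuning per topology (e.g. with CMA-ES) rather than on a sparse manual grid.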
