Paper Title

POGD: Gradient Descent with New Stochastic Rules

Paper Authors

Feihu Han, Sida Xing, Sui Yang Khoo

Paper Abstract

This paper introduces Particle Optimized Gradient Descent (POGD), an algorithm based on gradient descent that integrates the particle swarm optimization (PSO) principle into its iteration. The experiments indicate that the algorithm has an adaptive learning ability. The experiments in this paper focus mainly on the training speed needed to reach a target value and on the ability to avoid local minima. They are carried out with convolutional neural network (CNN) image classification on the MNIST and CIFAR-10 datasets.
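
The abstract describes POGD only at a high level, as gradient descent whose iteration borrows the PSO update principle. The sketch below is a minimal illustration of what such a hybrid step could look like; the function name, the hyperparameters (inertia, c1, c2), and the exact mix of gradient and best-position terms are assumptions made for illustration, not the update rule defined in the paper.

```python
import numpy as np

def pogd_style_step(theta, grad, velocity, personal_best, global_best,
                    lr=0.01, inertia=0.9, c1=0.5, c2=0.5, rng=None):
    # Hypothetical hybrid step: blend a gradient-descent pull with
    # PSO-style attraction toward the best positions seen so far.
    # Coefficient names and values are illustrative only.
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.random(np.shape(theta))
    r2 = rng.random(np.shape(theta))
    velocity = (inertia * velocity
                - lr * grad                              # gradient-descent term
                + c1 * r1 * (personal_best - theta)      # PSO "cognitive" term
                + c2 * r2 * (global_best - theta))       # PSO "social" term
    return theta + velocity, velocity

# Toy usage: minimize f(x) = (x - 3)^2 with the hybrid step.
theta = np.array([10.0])
velocity = np.zeros_like(theta)
best_theta, best_loss = theta.copy(), np.inf
for _ in range(200):
    grad = 2.0 * (theta - 3.0)
    loss = float(np.sum((theta - 3.0) ** 2))
    if loss < best_loss:
        best_theta, best_loss = theta.copy(), loss
    theta, velocity = pogd_style_step(theta, grad, velocity,
                                      personal_best=best_theta,
                                      global_best=best_theta)
print(theta)  # approaches 3.0
```

In this toy loop the personal-best and global-best positions coincide, so the PSO terms mainly act as a pull back toward the best parameters seen so far whenever a gradient step overshoots; the paper's actual POGD rule and its behaviour on CNN training should be taken from the paper itself.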
