Title
Embedding a Heavy-Ball type of Momentum into the Estimating Sequences
Authors
Abstract
We present a new accelerated gradient-based method for solving smooth unconstrained optimization problems. The goal is to embed a heavy-ball type of momentum into the Fast Gradient Method (FGM). For this purpose, we devise a generalization of the estimating sequences, which allows for encoding any form of information about the cost function that can aid in further accelerating the minimization process. In the black-box framework, we propose a construction for the generalized estimating sequences, which is obtained by exploiting the history of the previously constructed estimating functions. From the viewpoint of efficiency estimates, we prove that the lower bound on the number of iterations for the proposed method is $\mathcal{O}\left(\sqrt{\frac{\kappa}{2}}\right)$. Our theoretical results are further corroborated by extensive numerical experiments on various types of optimization problems often dealt with in signal processing. Both synthetic and real-world datasets are utilized to demonstrate the efficiency of our proposed method in terms of decreasing the distance to the optimal solution, as well as in terms of decreasing the norm of the gradient.
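As background for the heavy-ball type of momentum mentioned in the abstract, the following is a minimal sketch of the classical heavy-ball (Polyak) iteration, which augments a gradient step with a momentum term $\beta(x_k - x_{k-1})$. This is illustrative only and is not the method proposed in the paper; the quadratic test problem and the step-size choices below are our own assumptions.

```python
import numpy as np

def heavy_ball(grad, x0, alpha, beta, iters):
    """Classical heavy-ball iteration:
    x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Illustrative strongly convex quadratic f(x) = 0.5 * x^T A x, minimizer x* = 0.
A = np.diag([1.0, 10.0])   # eigenvalues give L = 10, mu = 1, condition number kappa = 10
grad = lambda x: A @ x
L, mu = 10.0, 1.0

# Polyak's classical tuning for quadratics (an assumption of this sketch):
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2

x = heavy_ball(grad, np.array([5.0, 5.0]), alpha, beta, 100)
```

On this quadratic the iterates contract toward the minimizer at a rate governed by $\sqrt{\kappa}$, which is the kind of dependence on the condition number that momentum-based acceleration aims for.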