Paper Title


Gradient Descent Ascent for Minimax Problems on Riemannian Manifolds

Paper Authors

Feihu Huang, Shangqian Gao

Paper Abstract


In this paper, we study a class of useful minimax problems on Riemannian manifolds and propose a class of effective Riemannian gradient-based methods to solve them. Specifically, we propose an effective Riemannian gradient descent ascent (RGDA) algorithm for deterministic minimax optimization. Moreover, we prove that our RGDA has a sample complexity of $O(κ^2ε^{-2})$ for finding an $ε$-stationary solution of Geodesically-Nonconvex Strongly-Concave (GNSC) minimax problems, where $κ$ denotes the condition number. At the same time, we present an effective Riemannian stochastic gradient descent ascent (RSGDA) algorithm for stochastic minimax optimization, which has a sample complexity of $O(κ^4ε^{-4})$ for finding an $ε$-stationary solution. To further reduce the sample complexity, we propose an accelerated Riemannian stochastic gradient descent ascent (Acc-RSGDA) algorithm based on a momentum-based variance-reduction technique. We prove that our Acc-RSGDA algorithm achieves a lower sample complexity of $\tilde{O}(κ^{4}ε^{-3})$ in searching for an $ε$-stationary solution of GNSC minimax problems. Extensive experimental results on robust distributional optimization and robust Deep Neural Networks (DNNs) training over the Stiefel manifold demonstrate the efficiency of our algorithms.
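The abstract names the RGDA update but does not spell it out. Below is a minimal, illustrative sketch of the generic Riemannian gradient descent ascent template on the Stiefel manifold (the setting of the paper's experiments), not the authors' exact algorithm: the toy objective $f(X, y) = y^\top A X v - \frac{λ}{2}\|y\|^2$, the QR retraction, the step sizes `eta_x`/`eta_y`, and all variable names are assumptions introduced for illustration.

```python
import numpy as np

def sym(M):
    """Symmetric part of a square matrix."""
    return 0.5 * (M + M.T)

def stiefel_proj(X, G):
    """Project a Euclidean gradient G onto the tangent space of St(d, r) at X."""
    return G - X @ sym(X.T @ G)

def qr_retract(X, xi):
    """QR-based retraction: map the tangent step X + xi back onto St(d, r)."""
    Q, R = np.linalg.qr(X + xi)
    s = np.sign(np.diag(R))
    s[s == 0] = 1.0          # avoid zeroing a column when a diagonal entry is 0
    return Q * s

rng = np.random.default_rng(0)
d, r, lam = 8, 3, 1.0        # lam > 0 makes the toy f strongly concave in y

# Toy GNSC-style objective: f(X, y) = y^T A X v - (lam/2) ||y||^2,
# minimized over X on the Stiefel manifold St(d, r), maximized over y in R^d.
A = rng.standard_normal((d, d))
v = rng.standard_normal(r)

X, _ = np.linalg.qr(rng.standard_normal((d, r)))   # random point on St(d, r)
y = np.zeros(d)
eta_x, eta_y = 0.05, 0.1                           # illustrative step sizes

for t in range(500):
    gX = stiefel_proj(X, np.outer(A.T @ y, v))     # Riemannian gradient w.r.t. X
    gy = A @ (X @ v) - lam * y                     # Euclidean gradient w.r.t. y
    X = qr_retract(X, -eta_x * gX)                 # descent step on the manifold
    y = y + eta_y * gy                             # ascent step in Euclidean space
    if t % 100 == 0:
        print(t, np.linalg.norm(gX), np.linalg.norm(gy))
```

The key structural point the sketch illustrates is the alternation: the geodesically nonconvex variable $X$ takes a retraction-based descent step along the Riemannian gradient, while the strongly concave variable $y$ takes an ordinary Euclidean ascent step.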
