Paper Title


Efficiently Escaping Saddle Points in Bilevel Optimization

Paper Authors

Minhui Huang, Xuxing Chen, Kaiyi Ji, Shiqian Ma, Lifeng Lai

Paper Abstract

Bilevel optimization is one of the fundamental problems in machine learning and optimization. Recent theoretical developments in bilevel optimization focus on finding first-order stationary points in the nonconvex-strongly-convex case. In this paper, we analyze algorithms that can escape saddle points in nonconvex-strongly-convex bilevel optimization. Specifically, we show that perturbed approximate implicit differentiation (AID) with a warm-start strategy finds an $\epsilon$-approximate local minimum of bilevel optimization in $\tilde{O}(\epsilon^{-2})$ iterations with high probability. Moreover, we propose an inexact NEgative-curvature-Originated-from-Noise algorithm (iNEON), a pure first-order algorithm that can escape saddle points and find a local minimum of stochastic bilevel optimization. As a by-product, we provide the first nonasymptotic analysis of the perturbed multi-step gradient descent ascent (GDmax) algorithm, which converges to a local minimax point for minimax problems.
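
The abstract's key mechanism, adding a small random perturbation to an AID-style hypergradient step once the gradient becomes small, can be sketched on a toy problem. The lower-level objective, step sizes, perturbation radius, and stopping rule below are illustrative assumptions for exposition, not the paper's exact algorithm or constants.

```python
# Minimal sketch of perturbed hypergradient descent in the spirit of
# perturbed AID. Toy problem and all hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy bilevel problem (assumed for illustration):
#   upper level: f(x, y) = cos(x[0]) + 0.5 * ||y - x||^2   (nonconvex in x)
#   lower level: g(x, y) = 0.5 * ||y||^2 - x @ y           (strongly convex in y)
# Here y*(x) = x, so F(x) = f(x, y*(x)) = cos(x[0]) and the true
# hypergradient is (-sin(x[0]), 0), which the AID formula reproduces.

def grad_g_y(x, y):
    """∇_y g(x, y) for the toy lower-level problem."""
    return y - x

def grad_f_x(x, y):
    """∇_x f(x, y) for the toy upper-level problem."""
    g = -(y - x)
    g[0] -= np.sin(x[0])
    return g

def grad_f_y(x, y):
    """∇_y f(x, y) for the toy upper-level problem."""
    return y - x

def aid_hypergradient(x, y, inner_steps=50, lr_y=0.5):
    """AID-style estimate: warm-started inner GD on g, then the implicit
    gradient  ∇F = ∇_x f - ∇²_{xy} g [∇²_{yy} g]^{-1} ∇_y f."""
    for _ in range(inner_steps):
        y = y - lr_y * grad_g_y(x, y)
    d = x.size
    hess_yy = np.eye(d)        # ∇²_{yy} g = I  for this toy g
    hess_xy = -np.eye(d)       # ∇²_{xy} g = -I for this toy g
    v = np.linalg.solve(hess_yy, grad_f_y(x, y))
    return grad_f_x(x, y) - hess_xy @ v, y

def perturbed_aid(x0, steps=300, lr_x=0.2, eps=1e-3, radius=1e-2):
    """Hypergradient descent; near a stationary point, inject noise so the
    iterate can slide off a saddle along a negative-curvature direction.
    (Real perturbed methods perturb once per escape episode and then
    monitor decrease; this loop is deliberately simplified.)"""
    x = x0.copy()
    y = np.zeros_like(x0)      # warm start: y is reused across outer iterations
    for _ in range(steps):
        g, y = aid_hypergradient(x, y)
        if np.linalg.norm(g) <= eps:
            x = x + radius * rng.standard_normal(x.shape)  # perturbation step
        else:
            x = x - lr_x * g
    return x

# Starting at the stationary point x[0] = 0 (a maximizer of cos), the
# perturbation lets the iterate escape toward the minimizer x[0] = ±π.
print(perturbed_aid(np.array([0.0, 0.0])))
```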
