Paper Title

Tight Lower Complexity Bounds for Strongly Convex Finite-Sum Optimization

Paper Authors

Min Zhang, Yao Shu, Kun He

Paper Abstract

Finite-sum optimization plays an important role in the area of machine learning, and hence has triggered a surge of interest in recent years. To address this optimization problem, various randomized incremental gradient methods have been proposed with guaranteed upper and lower complexity bounds for their convergence. Nonetheless, these lower bounds rely on restrictive conditions: a deterministic optimization algorithm, or a fixed probability distribution for the selection of component functions. Moreover, in certain cases some lower bounds do not even match the upper bounds of the best-known methods. To overcome these limitations, we derive tight lower complexity bounds of randomized incremental gradient methods, including SAG, SAGA, SVRG, and SARAH, for two typical cases of finite-sum optimization. Specifically, our results tightly match the upper complexity of Katyusha or VRADA when each component function is strongly convex and smooth, and tightly match the upper complexity of SDCA without duality and of KatyushaX when the finite-sum function is strongly convex and the component functions are average smooth.
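
For context, a minimal sketch of the standard finite-sum formulation and the two complexity regimes referenced above; the notation ($\kappa$, $\bar{\kappa}$, target accuracy $\epsilon$) and the exact bound expressions are assumptions based on commonly stated forms of these results, not quoted from this paper:

\[
\min_{x \in \mathbb{R}^d} \; F(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} f_i(x).
\]

Case 1 (each $f_i$ is $\mu$-strongly convex and $L$-smooth): the matching upper complexity, attained for example by Katyusha or VRADA, is of order
\[
O\!\left( \left( n + \sqrt{n\kappa} \right) \log \frac{1}{\epsilon} \right), \qquad \kappa = \frac{L}{\mu}.
\]

Case 2 ($F$ is $\mu$-strongly convex and the $f_i$ are average smooth with constant $\bar{L}$): the matching upper complexity, attained for example by SDCA without duality or KatyushaX, is of order
\[
O\!\left( \left( n + n^{3/4} \sqrt{\bar{\kappa}} \right) \log \frac{1}{\epsilon} \right), \qquad \bar{\kappa} = \frac{\bar{L}}{\mu}.
\]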
