Paper Title
A Globally Convergent Proximal Newton-Type Method in Nonsmooth Convex Optimization
Paper Authors
Paper Abstract
The paper proposes and justifies a new algorithm of the proximal Newton type to solve a broad class of nonsmooth composite convex optimization problems without strong convexity assumptions. Based on advanced notions and techniques of variational analysis, we establish implementable results on the global convergence of the proposed algorithm as well as its local convergence with superlinear and quadratic rates. For certain structured problems, the obtained local convergence conditions do not require the local Lipschitz continuity of the corresponding Hessian mappings, which is a crucial assumption used in the literature to ensure superlinear convergence of other proximal Newton-type algorithms. Numerical experiments on the $l_1$-regularized logistic regression model illustrate the applicability of the proposed algorithm to practically important problems.
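For orientation, the Python sketch below illustrates the generic proximal Newton idea on the $l_1$-regularized logistic regression problem mentioned in the abstract: at each outer step a quadratic model of the smooth loss is minimized together with the $l_1$ term, here inexactly by a proximal-gradient (ISTA) inner loop, and a backtracking line search is applied to the composite objective. This is a minimal sketch under standard textbook choices, not the algorithm proposed in the paper; all function names (prox_newton_l1, soft_threshold, etc.), parameter values, and the synthetic data are illustrative assumptions.

```python
# Minimal sketch of a proximal Newton-type iteration for
#   minimize  g(x) + lam*||x||_1,   g(x) = (1/n) * sum_i log(1 + exp(-y_i * a_i^T x)).
# Not the paper's algorithm; a generic illustration only.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def logistic_value_grad_hess(x, A, y):
    """Value, gradient, and Hessian of the averaged logistic loss g(x)."""
    n = A.shape[0]
    margins = y * (A @ x)
    value = np.mean(np.logaddexp(0.0, -margins))
    p = np.exp(-np.logaddexp(0.0, margins))      # sigmoid(-margins), computed stably
    grad = -(A.T @ (y * p)) / n
    W = p * (1.0 - p)                            # diagonal Hessian weights
    hess = (A.T * W) @ A / n
    return value, grad, hess

def prox_newton_l1(A, y, lam, max_iter=50, inner_iter=100, tol=1e-8):
    """Illustrative proximal Newton-type loop: inexact subproblem solve + backtracking."""
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        g_val, grad, hess = logistic_value_grad_hess(x, A, y)
        # Inexactly minimize grad^T d + 0.5 d^T H d + lam*||x + d||_1 via ISTA steps.
        L = np.linalg.norm(hess, 2) + 1e-12      # Lipschitz constant of the quadratic model
        z = x.copy()
        for _ in range(inner_iter):
            z = soft_threshold(z - (grad + hess @ (z - x)) / L, lam / L)
        d = z - x
        if np.linalg.norm(d) <= tol:
            break
        # Backtracking line search on the composite objective F = g + lam*||.||_1.
        F = lambda v: logistic_value_grad_hess(v, A, y)[0] + lam * np.abs(v).sum()
        F_x = g_val + lam * np.abs(x).sum()
        decrease = grad @ d + lam * (np.abs(z).sum() - np.abs(x).sum())
        t = 1.0
        while F(x + t * d) > F_x + 1e-4 * t * decrease and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

# Usage on synthetic data with labels in {-1, +1}.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 20))
y = np.sign(A @ np.concatenate([rng.standard_normal(5), np.zeros(15)])
            + 0.1 * rng.standard_normal(200))
x_hat = prox_newton_l1(A, y, lam=0.05)
print("nonzero coefficients:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```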