Paper Title

Randomised subspace methods for non-convex optimization, with applications to nonlinear least-squares

Paper Authors

Cartis, Coralia, Fowkes, Jaroslav, Shao, Zhen

Paper Abstract

We propose a general random subspace framework for unconstrained nonconvex optimization problems that requires a weak probabilistic assumption on the subspace gradient, which we show is satisfied by various random matrix ensembles, such as Gaussian and sparse sketching, using Johnson-Lindenstrauss embedding properties. We show that, when safeguarded with trust region or quadratic regularization, this random subspace approach satisfies, with high probability, a complexity bound of order $\mathcal{O}(\epsilon^{-2})$ to drive the (full) gradient below $\epsilon$, matching in the accuracy order the deterministic counterparts of these methods, and it secures almost-sure convergence. Furthermore, no problem-dimension dependence appears explicitly in the projection size of the sketching matrix, allowing the choice of low-dimensional subspaces. We particularise this framework to Random Subspace Gauss-Newton (RS-GN) methods for nonlinear least-squares problems, which only require the calculation of the Jacobian in the subspace and enjoy similar complexity guarantees. Numerical experiments with RS-GN on CUTEst nonlinear least-squares problems are also presented, with some encouraging results.
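To make the abstract concrete, the sketch below illustrates one iteration of a random subspace Gauss-Newton step of the kind described: draw a Gaussian sketching matrix, evaluate the Jacobian only in the sketched subspace, and solve a quadratically regularized reduced subproblem. This is a minimal illustration under stated assumptions, not the paper's implementation; the names `residual`, `jac_subspace`, and the fixed regularizer `sigma` are illustrative choices, and step acceptance/safeguard updates (trust region or adaptive regularization) are omitted.

```python
import numpy as np

def rs_gn_step(x, residual, jac_subspace, l, sigma=1e-3, rng=None):
    """One illustrative Random Subspace Gauss-Newton step.

    x            : current iterate in R^d
    residual     : callable, r(x) in R^m
    jac_subspace : callable, returns J(x) S^T in R^{m x l} for sketch S
                   (the Jacobian is only ever needed in the subspace)
    l            : subspace (sketch) dimension, typically l << d
    sigma        : quadratic regularization weight (fixed here for simplicity)
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    # Gaussian sketching matrix S in R^{l x d}, scaled so that
    # E[S^T S] = I, consistent with Johnson-Lindenstrauss embeddings.
    S = rng.standard_normal((l, d)) / np.sqrt(l)
    r = residual(x)
    Js = jac_subspace(x, S)  # m x l reduced Jacobian J(x) S^T
    # Regularized reduced Gauss-Newton subproblem over s_hat in R^l:
    #   min_{s_hat} ||r + Js s_hat||^2 + sigma ||s_hat||^2
    A = Js.T @ Js + sigma * np.eye(l)
    s_hat = np.linalg.solve(A, -Js.T @ r)
    # Lift the low-dimensional step back to the full space.
    return x + S.T @ s_hat
```

For a linear residual the regularized reduced model is exact, so the step decreases the objective whenever the sketched gradient is nonzero, which for a Gaussian sketch holds almost surely; the key cost saving is that only the $m \times l$ matrix $J(x)S^\top$ is ever formed, never the full $m \times d$ Jacobian.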
