Paper Title


On the analysis of optimization with fixed-rank matrices: a quotient geometric view

Authors

Shuyu Dong, Bin Gao, Wen Huang, Kyle A. Gallivan

Abstract


We study a type of Riemannian gradient descent (RGD) algorithm, designed through Riemannian preconditioning, for optimization on $\mathcal{M}_k^{m\times n}$ -- the set of $m\times n$ real matrices with a fixed rank $k$. Our analysis is based on a quotient geometric view of $\mathcal{M}_k^{m\times n}$: by identifying this set with the quotient manifold of a two-term product space $\mathbb{R}_*^{m\times k}\times \mathbb{R}_*^{n\times k}$ of matrices with full column rank via matrix factorization, we find an explicit form for the update rule of the RGD algorithm, which leads to a novel approach to analyzing its convergence behavior in rank-constrained optimization. We then deduce some interesting properties that reflect how RGD differs from other matrix factorization algorithms, such as those based on Euclidean geometry. In particular, we show that the RGD algorithm is not only faster than Euclidean gradient descent but also, unlike the latter, does not rely on balancing techniques to ensure its efficiency. We further show that this RGD algorithm is guaranteed to solve matrix sensing and matrix completion problems at a linear convergence rate under the restricted positive definiteness property. Numerical experiments on matrix sensing and completion are provided to demonstrate these properties.
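The abstract mentions an explicit update rule obtained from the factorization view. A minimal sketch of the kind of preconditioned factored update this refers to, on a toy full-observation low-rank recovery problem (the actual paper treats matrix sensing and completion with sampling operators; the step size, problem sizes, and the particular preconditioner shown here are illustrative assumptions, not the paper's exact algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: recover a rank-k matrix M by minimizing
# f(G, H) = 0.5 * ||G H^T - M||_F^2 over the two factors.
m, n, k = 30, 20, 3
M = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))

# Deliberately unbalanced initialization: plain Euclidean gradient descent
# would typically need a rebalancing step here, whereas the preconditioned
# updates are insensitive to the relative scaling of G and H.
G = 10.0 * rng.standard_normal((m, k))
H = 0.1 * rng.standard_normal((n, k))

eta = 0.5  # step size (hypothetical choice for this toy problem)
for _ in range(200):
    R = G @ H.T - M          # residual
    grad_G = R @ H           # Euclidean gradient w.r.t. G
    grad_H = R.T @ G         # Euclidean gradient w.r.t. H
    # Preconditioned updates: right-multiply each Euclidean gradient by the
    # inverse of the other factor's k-by-k Gram matrix.
    G = G - eta * grad_G @ np.linalg.inv(H.T @ H)
    H = H - eta * grad_H @ np.linalg.inv(G.T @ G)

rel_err = np.linalg.norm(G @ H.T - M) / np.linalg.norm(M)
print(rel_err)
```

The Gram-matrix preconditioner is what makes the iteration scale-invariant: rescaling `G` by `c` and `H` by `1/c` leaves the iterates' product unchanged, which is the abstract's point that no balancing technique is needed.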
