Title
Average-Case Complexity of Tensor Decomposition for Low-Degree Polynomials
Authors
Abstract
Suppose we are given an $n$-dimensional order-3 symmetric tensor $T \in (\mathbb{R}^n)^{\otimes 3}$ that is the sum of $r$ random rank-1 terms. The problem of recovering the rank-1 components is possible in principle when $r \lesssim n^2$ but polynomial-time algorithms are only known in the regime $r \ll n^{3/2}$. Similar "statistical-computational gaps" occur in many high-dimensional inference tasks, and in recent years there has been a flurry of work on explaining the apparent computational hardness in these problems by proving lower bounds against restricted (yet powerful) models of computation such as statistical queries (SQ), sum-of-squares (SoS), and low-degree polynomials (LDP). However, no such prior work exists for tensor decomposition, largely because its hardness does not appear to be explained by a "planted versus null" testing problem. We consider a model for random order-3 tensor decomposition where one component is slightly larger in norm than the rest (to break symmetry), and the components are drawn uniformly from the hypercube. We resolve the computational complexity in the LDP model: $O(\log n)$-degree polynomial functions of the tensor entries can accurately estimate the largest component when $r \ll n^{3/2}$ but fail to do so when $r \gg n^{3/2}$. This provides rigorous evidence suggesting that the best known algorithms for tensor decomposition cannot be improved, at least by known approaches. A natural extension of the result holds for tensors of any fixed order $k \ge 3$, in which case the LDP threshold is $r \sim n^{k/2}$.
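The planted model described in the abstract can be illustrated with a minimal NumPy sketch: draw $r$ components uniformly from the hypercube $\{\pm 1\}^n$, scale the first component's norm up slightly to break symmetry, and sum the resulting rank-1 terms. The function name and the specific `boost` factor are illustrative choices, not taken from the paper.

```python
import numpy as np

def planted_tensor(n, r, boost=1.1, seed=0):
    """Sample an order-3 symmetric tensor T = sum_i (s_i * u_i)^{\otimes 3}."""
    rng = np.random.default_rng(seed)
    # Components drawn uniformly from the hypercube {+1, -1}^n.
    U = rng.choice([-1.0, 1.0], size=(r, n))
    # The first (planted) component gets a slightly larger norm.
    scales = np.ones(r)
    scales[0] = boost
    # Each rank-1 term (s_i u_i)^{otimes 3} contributes s_i^3 * u_i (x) u_i (x) u_i.
    T = np.einsum('i,ia,ib,ic->abc', scales**3, U, U, U)
    return T, U

T, U = planted_tensor(n=8, r=4)
# T is symmetric under any permutation of its three modes.
assert np.allclose(T, T.transpose(1, 0, 2))
assert np.allclose(T, T.transpose(2, 1, 0))
```

Recovering the boosted row `U[0]` from `T` alone is the estimation task the abstract analyzes; the LDP result says degree-$O(\log n)$ polynomials in the entries of `T` succeed when $r \ll n^{3/2}$ and fail when $r \gg n^{3/2}$.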