Paper Title

Permutation Search of Tensor Network Structures via Local Sampling

Authors

Chao Li, Junhua Zeng, Zerui Tao, Qibin Zhao

Abstract


Recent works have put much effort into tensor network structure search (TN-SS), which aims to select a suitable tensor network (TN) structure, involving the TN-ranks, formats, and so on, for decomposition or learning tasks. In this paper, we consider a practical variant of TN-SS, dubbed TN permutation search (TN-PS), in which we search for good mappings from tensor modes onto TN vertices (core tensors) to obtain compact TN representations. We conduct a theoretical investigation of TN-PS and propose a practically efficient algorithm to solve the problem. Theoretically, we prove counting and metric properties of the TN-PS search spaces, analyzing for the first time the impact of TN structures on these properties. Numerically, we propose a novel meta-heuristic algorithm, in which the search is performed by randomly sampling in a neighborhood established by our theory, and then recurrently updating the neighborhood until convergence. Numerical results demonstrate that the new algorithm can reduce the required model size of TNs across extensive benchmarks, implying an improvement in the expressive power of TNs. Furthermore, the computational cost of the new algorithm is significantly less than that in~\cite{li2020evolutionary}.
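The abstract describes the algorithm only at a high level: sample permutations (mode-to-vertex mappings) in a local neighborhood, move to the best candidate, and repeat until convergence. The sketch below illustrates that generic local-sampling loop on the permutation group. The transposition-based neighborhood, the candidate budget, and the greedy update rule are illustrative assumptions for this sketch, not the paper's exact procedure or its theoretically derived neighborhood.

```python
import itertools
import random

def neighborhood(perm, radius=1):
    # Permutations reachable from `perm` by at most `radius` transpositions.
    # (An assumed neighborhood; the paper establishes its own in theory.)
    n = len(perm)
    frontier = {tuple(perm)}
    for _ in range(radius):
        nxt = set()
        for p in frontier:
            for i, j in itertools.combinations(range(n), 2):
                q = list(p)
                q[i], q[j] = q[j], q[i]
                nxt.add(tuple(q))
        frontier |= nxt
    return frontier

def local_permutation_search(loss, n_modes, n_samples=8, max_iters=50, seed=0):
    """Generic local-sampling sketch: sample candidates in the current
    neighborhood, move to the best one found, and stop when no sampled
    candidate improves the loss."""
    rng = random.Random(seed)
    center = tuple(range(n_modes))   # start from the identity mapping
    best = loss(center)
    for _ in range(max_iters):
        candidates = list(neighborhood(center))
        rng.shuffle(candidates)
        improved = False
        for p in candidates[:n_samples]:
            v = loss(p)              # e.g. model size of the fitted TN
            if v < best:
                center, best, improved = p, v, True
        if not improved:             # converged: no better neighbor sampled
            break
    return center, best
```

In TN-PS the `loss` would be the compression cost (model size) of the TN fitted under a given mode-to-vertex permutation; here any function over permutations can stand in for it.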
