Paper Title

Disentangling a Deep Learned Volume Formula

Paper Authors

Jessica Craven, Vishnu Jejjala, Arjun Kar

Paper Abstract

We present a simple phenomenological formula which approximates the hyperbolic volume of a knot using only a single evaluation of its Jones polynomial at a root of unity. The average error is just $2.86$% on the first $1.7$ million knots, which represents a large improvement over previous formulas of this kind. To find the approximation formula, we use layer-wise relevance propagation to reverse engineer a black box neural network which achieves a similar average error for the same approximation task when trained on $10$% of the total dataset. The particular roots of unity which appear in our analysis cannot be written as $e^{2\pi i/(k+2)}$ with integer $k$; therefore, the relevant Jones polynomial evaluations are not given by unknot-normalized expectation values of Wilson loop operators in conventional $SU(2)$ Chern–Simons theory with level $k$. Instead, they correspond to an analytic continuation of such expectation values to fractional level. We briefly review the continuation procedure and comment on the presence of certain Lefschetz thimbles, to which our approximation formula is sensitive, in the analytically continued Chern–Simons integration cycle.
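
The abstract does not spell out the explicit formula, so the following is only a minimal Python sketch of the general shape of such an approximation: evaluate the Jones polynomial at one fixed root of unity and map $\log|J|$ to a volume estimate through an affine function. The evaluation angle `theta` and the coefficients `a`, `b` below are hypothetical placeholders, not the paper's fitted values.

```python
# Minimal sketch (assumptions noted): a "volume from a single Jones evaluation"
# approximation of the general form vol ~ a * log|J_K(e^{i*theta})| + b.
# theta, a, b are placeholders to be fit by regression on a knot dataset;
# they are NOT the constants or root of unity found in the paper.
import numpy as np

def jones_eval(coeffs, min_degree, q):
    """Evaluate a Jones polynomial at q, given coefficients.

    coeffs[i] multiplies q**(min_degree + i).
    """
    return sum(c * q ** (min_degree + i) for i, c in enumerate(coeffs))

def approx_volume(coeffs, min_degree, theta, a, b):
    """Hypothetical volume estimate, affine in log|J(e^{i*theta})|."""
    q = np.exp(1j * theta)
    return a * np.log(abs(jones_eval(coeffs, min_degree, q))) + b

# Example: figure-eight knot 4_1, whose Jones polynomial is
# q^{-2} - q^{-1} + 1 - q + q^2.
coeffs_41 = [1, -1, 1, -1, 1]
print(approx_volume(coeffs_41, min_degree=-2,
                    theta=3 * np.pi / 4,  # placeholder evaluation angle
                    a=3.0, b=0.0))        # placeholder fitted constants
```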
