Paper Title
Transformers meet Stochastic Block Models: Attention with Data-Adaptive Sparsity and Cost
Paper Authors
Paper Abstract
To overcome the quadratic cost of self-attention, recent works have proposed various sparse attention modules, most of which fall under one of two groups: 1) sparse attention under hand-crafted patterns and 2) full attention followed by a sparse variant of softmax such as $\alpha$-entmax. Unfortunately, the first group lacks adaptability to data while the second still requires quadratic cost in training. In this work, we propose SBM-Transformer, a model that resolves both problems by endowing each attention head with a mixed-membership Stochastic Block Model (SBM). Each attention head then data-adaptively samples a bipartite graph, whose adjacency matrix is used as an attention mask for each input. During backpropagation, a straight-through estimator is used to flow gradients beyond the discrete sampling step and adjust the probabilities of sampled edges based on the predictive loss. The forward and backward costs are thus linear in the number of edges, which each attention head can also choose flexibly based on the input. By assessing the distribution of sampled graphs, we theoretically show that SBM-Transformer is a universal approximator of arbitrary sequence-to-sequence functions in expectation. Empirical evaluations on the LRA and GLUE benchmarks demonstrate that our model outperforms previous efficient variants as well as the original Transformer with full attention. Our implementation can be found at https://github.com/sc782/SBM-Transformer.
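To make the masking mechanism concrete, below is a minimal PyTorch sketch of the idea described in the abstract. It is not the authors' implementation: the function names (`sbm_attention_mask`, `masked_attention`) and all shapes are assumptions, and for readability it materializes the full edge-probability matrix, which costs quadratic time; the actual model samples the edges directly so that cost scales with the number of sampled edges.

```python
import torch

def sbm_attention_mask(q_memb, k_memb, block_probs):
    """Sample a bipartite {0,1} attention mask from a mixed-membership SBM.

    q_memb:      (n, c) nonnegative memberships of each query over c clusters
    k_memb:      (m, c) nonnegative memberships of each key over c clusters
    block_probs: (c, c) inter-cluster edge probabilities
    Returns an (n, m) mask with straight-through gradients.
    """
    # Edge probability between query i and key j, clamped into [0, 1].
    edge_prob = (q_memb @ block_probs @ k_memb.T).clamp(0.0, 1.0)
    hard_mask = torch.bernoulli(edge_prob.detach())  # discrete sample, no gradient
    # Straight-through estimator: the forward pass uses the hard sample,
    # the backward pass flows gradients through edge_prob into the SBM parameters.
    return hard_mask + edge_prob - edge_prob.detach()

def masked_attention(q, k, v, mask):
    """Scaled dot-product attention restricted to the sampled edges."""
    scores = (q @ k.T) / q.shape[-1] ** 0.5
    # Use the dtype's minimum instead of -inf so a query with no sampled
    # edge yields a zero output row rather than NaNs.
    scores = scores.masked_fill(mask == 0, torch.finfo(scores.dtype).min)
    attn = torch.softmax(scores, dim=-1) * mask  # keep the mask in the autograd graph
    return attn @ v

# Toy usage: 8 queries/keys, 4 latent clusters, head dimension 16.
n, m, c, d = 8, 8, 4, 16
q, k, v = torch.randn(n, d), torch.randn(m, d), torch.randn(m, d)
q_memb = torch.rand(n, c, requires_grad=True)
k_memb = torch.rand(m, c, requires_grad=True)
block_probs = torch.rand(c, c, requires_grad=True)

mask = sbm_attention_mask(q_memb, k_memb, block_probs)
out = masked_attention(q, k, v, mask)
out.sum().backward()  # gradients reach the SBM parameters via the straight-through path
```

The `hard_mask + edge_prob - edge_prob.detach()` expression is the standard straight-through trick: its forward value equals the discrete sample, while the backward pass treats the mask as if it were the continuous edge probability, which is how the predictive loss can adjust the probabilities of sampled edges.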