Paper Title
Periodic Graph Transformers for Crystal Material Property Prediction
Paper Authors
Paper Abstract
We consider representation learning on periodic graphs encoding crystal materials. Different from regular graphs, periodic graphs consist of a minimum unit cell repeating itself on a regular lattice in 3D space. How to effectively encode these periodic structures poses unique challenges not present in regular graph representation learning. In addition to being E(3) invariant, periodic graph representations need to be periodic invariant. That is, the learned representations should be invariant to shifts of cell boundaries as they are artificially imposed. Furthermore, the periodic repeating patterns need to be captured explicitly as lattices of different sizes and orientations may correspond to different materials. In this work, we propose a transformer architecture, known as Matformer, for periodic graph representation learning. Our Matformer is designed to be invariant to periodicity and can capture repeating patterns explicitly. In particular, Matformer encodes periodic patterns by efficient use of geometric distances between the same atoms in neighboring cells. Experimental results on multiple common benchmark datasets show that our Matformer outperforms baseline methods consistently. In addition, our results demonstrate the importance of periodic invariance and explicit repeating pattern encoding for crystal representation learning.
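To make the abstract's encoding idea concrete, the following is a minimal NumPy sketch, not the authors' implementation, of how the geometric distances from an atom to its own periodic images in neighboring cells can be computed from the lattice vectors. The function name periodic_self_distances and the shells parameter are illustrative assumptions.

```python
import numpy as np

def periodic_self_distances(lattice, shells=1):
    """Distances from an atom to its own periodic images in neighboring cells.

    lattice : (3, 3) array whose rows are the lattice vectors l1, l2, l3.
    shells  : how many neighboring cells to include along each lattice axis.

    An atom and its periodic image share the same fractional coordinates, so
    the displacement to the image in cell (i, j, k) is i*l1 + j*l2 + k*l3,
    independent of where the cell boundary is drawn.
    """
    offsets = [
        (i, j, k)
        for i in range(-shells, shells + 1)
        for j in range(-shells, shells + 1)
        for k in range(-shells, shells + 1)
        if (i, j, k) != (0, 0, 0)
    ]
    vectors = np.array(offsets) @ lattice   # displacement to each periodic image
    return np.linalg.norm(vectors, axis=1)  # one distance per image

# Example: a cubic cell with 4 Å edges; the six nearest self-images are 4 Å away.
lattice = 4.0 * np.eye(3)
print(sorted(periodic_self_distances(lattice))[:6])
```

Because these displacements depend only on the lattice vectors and not on where the unit cell boundary is placed, features built from such distances are unaffected by boundary shifts, which is the periodic invariance the abstract refers to.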