December 1 - 6, 2024
Boston, Massachusetts
2024 MRS Fall Meeting & Exhibit
MT04.11.09

Hybrid Equivariant Graph Convolutional Neural Networks for Large Scale Atomistic Simulations

When and Where

Dec 5, 2024
4:30pm - 4:45pm
Hynes, Level 2, Room 210

Presenter(s)

Co-Author(s)

Alex Kutana1, Koji Shimizu2, Satoshi Watanabe3, Ryoji Asahi1

Nagoya University1, National Institute of Advanced Industrial Science and Technology2, The University of Tokyo3

Abstract

Regression with machine learning models offers a fast and accurate pathway to material property predictions. Equivariant physics-informed neural networks enable faster learning with smaller datasets by imposing appropriate symmetry constraints on the network weights. In this work, we evaluate the prediction errors, learning rates, and generalization ability of several architectures of equivariant graph convolutional neural networks (EGCNNs), targeting atom-wise and global tensorial material properties, and report state-of-the-art performance.

Atomic structures are represented by graphs, with atoms mapped to graph nodes and neighbor connections to edges, each assigned attributes and features. The core operation is convolution (or, more generally, message passing) on a graph. In the convolution block, node features are updated via message passing from neighboring nodes along the edges, with the message and update functions tuned for flexibility and accuracy of representation. The models are trained for end-to-end prediction of tensorial properties from structural input, with features, weights, and outputs equivariant with respect to the symmetries of three-dimensional Euclidean space.

We introduce Equivar, a simple EGCNN model for predicting tensors of Born effective charges from the atomic structure. The model is trained on ab initio datasets of oxides, including systems with mixed covalent and ionic bonding. It shows relative errors of a fraction of a percent to a few percent, depending on the dataset, and linear scaling of computational time with the number of atoms for both training and inference. Training takes 0.493 ms/atom/epoch and inference 0.340 ms/atom on a single NVIDIA RTX A6000 GPU. We analyze the learning curves and report the exponents of the empirical power law. The generalization error decreases faster for small datasets; however, power-law scaling continues up to the largest dataset size tested (~10^6 atoms), albeit with a smaller exponent.

With accuracy comparable to that of traditional ab initio approaches, Equivar offers much better scaling and faster evaluation, affording access to larger length and time scales and allowing one to simulate diverse physical phenomena. While the target properties of primary interest are atomic Born effective charges and static dielectric tensors, other direction-dependent properties can also be targeted with this universal approach. Combining different models offers the possibility of simultaneously evaluating various properties at low computational cost. In on-the-fly calculation scenarios, energies and forces are obtained from an EGCNN force field, while other properties (e.g., tensors of Born effective charges) are calculated simultaneously with a different EGCNN. Alternatively, a single hybrid model can be employed for multiple property predictions. We implement and discuss the advantages and disadvantages of each of these approaches, including simplicity of implementation and the possibility of model merging and transfer learning.
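
For illustration, a minimal sketch of the graph representation and a generic (non-equivariant) message-passing update is given below. This is not the Equivar implementation: the cutoff radius, feature sizes, and distance-based edge weights are hypothetical choices made only for the example, and an equivariant model would additionally constrain the message and update functions so that features transform consistently under rotations and translations.

    import numpy as np

    def build_graph(positions, cutoff=4.0):
        """Map atoms to nodes and neighbor pairs within the cutoff to directed edges."""
        n = len(positions)
        src, dst, vectors = [], [], []
        for i in range(n):
            for j in range(n):
                if i != j and np.linalg.norm(positions[j] - positions[i]) < cutoff:
                    src.append(i)
                    dst.append(j)
                    vectors.append(positions[j] - positions[i])
        return np.array(src), np.array(dst), np.array(vectors)

    def message_passing(node_feats, src, dst, edge_weights):
        """One convolution step: each node aggregates weighted messages from its neighbors."""
        updated = node_feats.copy()
        for s, d, w in zip(src, dst, edge_weights):
            updated[d] += w * node_feats[s]  # message sent from node s to node d
        return updated

    # Toy usage: three atoms, four scalar features per atom
    positions = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 1.5, 0.0]])
    src, dst, vecs = build_graph(positions)
    feats = np.ones((3, 4))
    weights = np.exp(-np.linalg.norm(vecs, axis=1))  # simple distance-based edge weights
    print(message_passing(feats, src, dst, weights))

In practice, equivariant implementations typically expand the edge vectors in spherical harmonics and build messages from tensor products of irreducible representations, and the dense double loop above would be replaced by a proper neighbor list for large structures.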

Symposium Organizers

Kjell Jorner, ETH Zurich
Jian Lin, University of Missouri-Columbia
Daniel Tabor, Texas A&M University
Dmitry Zubarev, IBM

Session Chairs

Jian Lin
Dmitry Zubarev
