Albert Musaelian1, Simon Batzner1, Nicola Molinari1, Boris Kozinsky1,2
Harvard University1, Robert Bosch LLC Research and Technology Center2
Machine learning interatomic potentials promise to make large length and time scales accessible to atomistic simulations by approximating the potential energy surface of expensive quantum chemical methods. As in other machine learning problems, the central task is to develop models that generalize robustly beyond the data they were trained on. This challenge is especially pressing for molecular dynamics: incorrect force predictions on even a small number of atoms in an equally small fraction of time steps can drive the system into unphysical configurations, resulting in unfaithful dynamics and simulation failure. The recent development of <i>E</i>(3)-equivariant models, exemplified by our NequIP [1] and Allegro [2] architectures, has led to significant demonstrated improvements in accuracy and generalization, even with small training sets. In this talk, we will discuss recent efforts to improve the robustness of the accurate simulations that NequIP and Allegro can drive at scale.<br/><br/>[1] Batzner, S., Musaelian, A., Sun, L. et al. E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nat Commun 13, 2453 (2022).<br/>[2] Musaelian, A., Batzner, S., et al. Learning Local Equivariant Representations for Large-Scale Atomistic Dynamics. arXiv:2204.05249