Albert Musaelian1, Simon Batzner1, Boris Kozinsky1,2
Harvard University1, Robert Bosch Research and Technology Center2
Message Passing Neural Networks (MPNNs) have emerged as the leading paradigm for constructing machine learning interatomic potentials. While MPNNs have repeatedly demonstrated low generalization error, they are minimally interpretable, not systematically improvable, and not amenable to parallelization. Here, we present DICE, the Deep Interatomic Cluster Expansion, an E(3)-equivariant deep learning potential that learns many-body information without message passing or convolutions. We show that DICE can be systematically improved by including higher-order interactions, comes with physically meaningful architecture choices, and is trivial to parallelize. DICE leverages a novel, learnable E(3)-equivariant many-body representation built on weighted tensor products of geometric features. This representation overcomes the combinatorial scaling of a complete cluster expansion, instead scaling linearly in the number of simultaneously correlated atoms. We show that using higher-order correlations of atoms systematically improves accuracy. Finally, we demonstrate high transferability to out-of-distribution data, investigate the learned energy decompositions, and discuss theoretical connections to existing work.
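The abstract's central claim, that iterated weighted products of geometric features reach a given body order at linear rather than combinatorial cost, can be illustrated with a minimal sketch. The code below is not the paper's architecture: it uses invariant (scalar) radial features only, omits E(3) equivariance, and stands in fixed random matrices for learned weights; the function names and the Gaussian radial basis are illustrative assumptions. Each pass through the loop multiplies the running latent features by the pooled two-body features once more, raising the correlation order by one, so reaching body order N costs N-2 products instead of enumerating all atom clusters of size N.

```python
import numpy as np

rng = np.random.default_rng(0)

def two_body_features(r_ij, n_basis=8):
    # Gaussian radial-basis embedding of one neighbor distance
    # (illustrative stand-in for the paper's geometric features)
    centers = np.linspace(0.5, 5.0, n_basis)
    return np.exp(-((r_ij - centers) ** 2))

def many_body_descriptor(distances, order, n_basis=8):
    """Iterated weighted products: each loop iteration raises the
    correlation (body) order by one, so the cost is linear in `order`
    rather than combinatorial in the number of atom clusters."""
    # pool two-body features over all neighbors of the central atom
    pooled = sum(two_body_features(r, n_basis) for r in distances)
    latent = pooled.copy()  # body order 2
    for _ in range(order - 2):
        # a weighted elementwise product stands in for the paper's
        # E(3)-equivariant weighted tensor product (invariant-only
        # simplification; W would be learned, here fixed random)
        W = rng.normal(size=(n_basis, n_basis)) / np.sqrt(n_basis)
        latent = (W @ latent) * pooled  # body order increases by one
    return latent

# body-order-4 descriptor for a central atom with three neighbors
desc = many_body_descriptor([1.0, 1.5, 2.2], order=4)
print(desc.shape)  # (8,)
```

Note the design point this makes concrete: the descriptor dimension stays fixed at `n_basis` regardless of body order, which is what allows the scheme to avoid the combinatorial blow-up of an explicit cluster expansion.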