
DS04.12.08 2023 MRS Fall Meeting

Robustness of Local Predictions in Machine Learning Models for Materials

When and Where

Nov 30, 2023
4:15pm - 4:30pm

Sheraton, Second Floor, Back Bay B

Presenter and Co-Author(s)

Sanggyu Chong1, Federico Grasselli1, Chiheb Ben Mahmoud1, Joe Morrow2, Volker Deringer2, Michele Ceriotti1

EPFL1, University of Oxford2

Abstract

Many machine learning (ML) models for the prediction of material properties rely on a decomposition of the global target quantity (e.g., total energy, magnetic dipole, electronic density of states) into local contributions associated with individual atoms or clusters of atoms. By learning the global quantities as sums of local contributions, the models are essentially trained to make predictions on the locally decomposed environments of a given structure. This approach is computationally convenient, significantly improves the transferability and scalability of the resulting ML models, and allows one to discover useful structure–property relationships by associating simple atomic motifs with complicated macroscopic properties. While the practical benefits of the local decomposition are clear, one must recognize that only the global quantity is rigorously defined, and that the decomposition can be carried out in arbitrarily many different ways. It hence remains largely unclear to what extent the local predictions of an ML model can be trusted.

In this contribution, we introduce the local prediction rigidity (LPR),[1] a generally applicable metric that quantifies how "rigid", or robust, the ML model is in the local predictions that it makes. Through our investigation of a range of models and materials datasets, we reveal that the LPR can vary drastically between different local predictions, and that it largely depends on the degeneracies associated with locally decomposing the global target quantity. We then investigate strategies to systematically increase the LPR through careful curation of the training dataset, and demonstrate these strategies in linear, kernel, and neural network models for silicon, carbon, and GaAs. The LPR has great potential for proposing effective active learning schemes, where the assessment and improvement of ML model performance can take place entirely at the local level. We also note that, as the derivation of the LPR is not limited to an atomic decomposition, it can be extended to other decomposition schemes: multiple body-order decomposition, short-range vs. long-range decomposition, etc. This allows one to scrutinize a model, precisely identify where it lacks robustness, and propose effective ways to improve it.

[1] Sanggyu Chong, Federico Grasselli, Chiheb Ben Mahmoud, Joe D. Morrow, Volker L. Deringer, Michele Ceriotti, "Robustness of Local Predictions in Atomistic Machine Learning Models", in preparation.
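
For illustration, here is a minimal sketch, not the authors' implementation, of the two ideas above for a ridge-regularized linear model; the toy feature vectors, the regularizer, and all names are hypothetical. The global target of each structure is fit as a sum of per-atom contributions, and the rigidity of a local prediction is estimated as the inverse of its variance under the model's Gaussian posterior, one concrete realization of such a metric:

import numpy as np

# Toy data: each structure is a set of per-atom feature vectors (rows of phi),
# but only the global target (e.g., total energy) is observed per structure.
rng = np.random.default_rng(0)
n_features = 4
structures = [rng.normal(size=(n_atoms, n_features)) for n_atoms in (3, 5, 4, 6)]
y = rng.normal(size=len(structures))  # one global target per structure

# Summing atomic features implements the decomposition E = sum_i e_i
# for a linear local model e_i = phi_i @ w.
X = np.stack([phi.sum(axis=0) for phi in structures])

# Ridge regression on the global targets only.
lam = 1e-2
C = X.T @ X + lam * np.eye(n_features)  # regularized covariance
w = np.linalg.solve(C, X.T @ y)

def local_prediction_rigidity(phi_i, C):
    # Inverse variance of the local prediction phi_i @ w under the
    # Gaussian posterior of the linear model: 1 / (phi_i^T C^{-1} phi_i).
    return 1.0 / (phi_i @ np.linalg.solve(C, phi_i))

for s, phi in enumerate(structures):
    local_energies = phi @ w  # per-atom contributions to the global target
    lprs = [local_prediction_rigidity(p, C) for p in phi]
    print(f"structure {s}: E = {local_energies.sum():+.3f}, "
          f"min/max LPR = {min(lprs):.2f}/{max(lprs):.2f}")

In this picture, environments that are poorly constrained by the training set (e.g., motifs that only ever appear together with others) have a large posterior variance and hence a low LPR; this is the kind of decomposition degeneracy the abstract refers to, and curating the dataset to include structures that isolate such motifs is one way to raise their LPR.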

Symposium Organizers

Andrew Detor, GE Research
Jason Hattrick-Simpers, University of Toronto
Yangang Liang, Pacific Northwest National Laboratory
Doris Segets, University of Duisburg-Essen

Symposium Support

Bronze
Cohere
