Christopher Andolina1, Wissam Saidi1
University of Pittsburgh1
Machine learning potentials (MLPs) for atomistic simulations have an enormous prospective impact on materials modeling, offering orders-of-magnitude speedups over density functional theory (DFT) calculations without appreciably sacrificing accuracy in predicting material properties. We show that MLP-based material property predictions converge faster with respect to the precision of Brillouin zone integrations than DFT-based property predictions for elemental systems. Further, we explore statistical error metrics to accurately determine <i>a priori</i> the precision level required of DFT training datasets for MLPs to ensure accelerated convergence of material property predictions, thus significantly reducing the computational expense of MLP development. We apply this approach to bimetallic systems, where the required DFT training dataset is substantially larger and thereby incurs higher computational costs. MLPs for the Al-Mg bimetallic system were generated based on this convergence-accelerated MLP approach and validated against DFT reference values. The resulting MLPs reproduce with high fidelity the general bulk properties of elemental and intermetallic Al-Mg systems, such as point defects and non-ground-state lattice configurations, mechanical properties, and various surface terminations for Miller indices <4. We use hybrid Monte Carlo/molecular dynamics (MC/MD) simulations from 200 to 800 K to model Mg surface segregation enthalpies in Al-Mg slabs (Mg = 0 to 20%). The Mg surface segregation enthalpies increase in the order (111) < (100) < (110), consistent with existing literature. Furthermore, we model the segregation tendencies obtained from the MC/MD simulations by adapting a recently introduced isotherm model for grain boundary segregation, which provides a more accurate description of Mg segregation behavior on surfaces.
Our accelerated convergence approach yields accurate and robust material predictions from lower-precision DFT training data, providing a more efficient workflow for developing MLP atomistic potentials.
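The abstract's central idea, choosing the coarsest DFT precision setting whose property predictions are already converged within a tolerance, can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's actual error-metric procedure: the function name, the tolerance, and the lattice-constant values are all invented for illustration.

```python
# Hypothetical sketch of an a-priori convergence check: given property
# values computed at increasingly fine precision settings (e.g. k-point
# densities), find the coarsest setting from which the prediction stays
# within `tol` of a well-converged reference value.
def coarsest_converged_setting(properties, reference, tol):
    """properties: list of (setting, value) pairs, coarse -> fine.
    Returns the first setting whose value, and every finer one,
    lies within `tol` of `reference`; None if none qualifies."""
    for i, (setting, _) in enumerate(properties):
        if all(abs(v - reference) < tol for _, v in properties[i:]):
            return setting
    return None

# Made-up lattice constants (angstrom) vs. k-point mesh density:
dft_runs = [(2, 4.062), (4, 4.051), (6, 4.0502), (8, 4.0501)]
print(coarsest_converged_setting(dft_runs, reference=4.050, tol=2e-3))
# -> 4 : a 4x4x4-equivalent mesh already suffices for this tolerance
```

Training data generated at that coarser setting would then feed the MLP, which is where the computational savings described in the abstract come from.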