December 1 - 6, 2024
Boston, Massachusetts
2024 MRS Fall Meeting & Exhibit
MT04.11.08

LightPFP—Accelerating the Development of Task-Specific Machine Learning Potentials Using Universal Potential

When and Where

Dec 5, 2024
4:15pm - 4:30pm
Hynes, Level 2, Room 210

Co-Author(s)

Wenwen Li1, Nontawat Charoenphakdee1, Yuta Tsuboi1, So Takamoto1, Ju Li2

1Preferred Networks, Inc.; 2Massachusetts Institute of Technology

Abstract

Machine learning interatomic potentials have emerged as powerful tools in materials research. While yielding accuracy comparable to ab initio calculations, machine learning force fields offer far greater computational efficiency. Two categories of machine learning potentials (MLPs), universal and task-specific, have been developed to address different needs.

Universal MLPs, such as PFP, CHGNet, and M3GNet, cover a wide range of materials, eliminating the need to train a model for each application. However, universal potentials tend to be very large, which makes them computationally inefficient and prohibitively expensive to apply to large-scale simulations. Conversely, task-specific MLPs, such as the moment tensor potential (MTP) and Allegro, focus on specific materials and applications and must be trained for each use case. By sacrificing universality, these models can be much smaller and faster to evaluate. Nevertheless, their training procedure is often time-consuming (e.g., weeks or months) because the training datasets typically require ab initio calculations.

To overcome this inefficiency in training task-specific MLPs, we propose LightPFP, a method that trains task-specific MLPs by leveraging PFP as a universal MLP. By substituting the universal MLP for ab initio calculations when generating training datasets, we reduce model training to a few hours. We have verified that the discrepancy between LightPFP and PFP stays within a few meV/atom, ensuring high accuracy. As a result, LightPFP enables the simulation of larger structures with near-DFT accuracy, beyond the capacity of PFP itself.

Furthermore, to improve the stability and broaden the applicability of the generated potentials, several techniques have been developed, including model pre-training and fine-tuning, transfer learning to different exchange-correlation functionals, uncertainty estimation, and active learning. Preliminary experiments show that these techniques improve the reliability and predictive power of machine learning potentials, facilitating their application across materials research domains.
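The distillation workflow described in the abstract can be illustrated with a minimal toy sketch. Everything below is a hypothetical stand-in, not the LightPFP implementation: a Lennard-Jones pair function plays the role of the large universal potential (PFP), and a linear fit over polynomial features of 1/r plays the role of the small task-specific model (MTP/Allegro-style). The key idea is the same: label configurations cheaply with the universal model instead of ab initio calculations, then fit and validate a lightweight surrogate against it.

```python
import numpy as np

# Hypothetical stand-in for a universal potential (the real method uses
# PFP, a large universal MLP; Lennard-Jones is used here only as a toy).
def universal_potential(r):
    """Pairwise energy for interatomic distance r (Lennard-Jones toy)."""
    return 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)

# Step 1: generate the training set by labeling sampled configurations
# with the universal potential, replacing ab initio calculations.
rng = np.random.default_rng(0)
r_train = rng.uniform(0.9, 2.5, size=200)
e_train = universal_potential(r_train)

# Step 2: fit a small task-specific surrogate. Polynomial features in
# 1/r stand in for the descriptors of a real task-specific MLP.
features = np.vander(1.0 / r_train, N=13)  # columns (1/r)^12 ... (1/r)^0
coef, *_ = np.linalg.lstsq(features, e_train, rcond=None)

# Step 3: validate the surrogate against the universal potential on
# held-out geometries (analogous to the meV/atom check in the abstract).
r_test = np.linspace(1.0, 2.4, 50)
e_ref = universal_potential(r_test)
e_pred = np.vander(1.0 / r_test, N=13) @ coef
mae = np.abs(e_pred - e_ref).mean()
print(f"surrogate MAE vs universal potential: {mae:.2e}")
```

Because the toy target lies exactly in the surrogate's feature span, the fit here is essentially exact; in the real setting the surrogate's reduced expressiveness is the price paid for a model that is far cheaper to evaluate at scale.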

Symposium Organizers

Kjell Jorner, ETH Zurich
Jian Lin, University of Missouri-Columbia
Daniel Tabor, Texas A&M University
Dmitry Zubarev, IBM

Session Chairs

Jian Lin
Dmitry Zubarev
