MRS Meetings and Events

 

DS01.11.02 2022 MRS Spring Meeting

Machine Learning the Scaling Property of Density Functionals via Data Augmentation

When and Where

May 23, 2022
7:35pm - 7:50pm

DS01-Virtual

Presenter and Co-Author(s)

Weiyi Gong1, Tao Sun2, Peng Chu1, Hexin Bai1, Anoj Aryal1, Shah Tanvir-Ur-Rahman Chowdhury1, Jie Yu1, Haibin Ling2, John Perdew1, Qimin Yan1

Temple University1, Stony Brook University, The State University of New York2

Abstract

Density functional theory (DFT) has become the standard method for studying the electronic properties of materials in physics, chemistry, and materials science. Recently, machine learning (ML) has been applied to parametrize exchange-correlation (XC) functionals without human domain knowledge, using kernel ridge regression, fully connected neural networks (NNs), and convolutional neural networks (CNNs). Physical XC functionals must satisfy several exact conditions, such as coordinate scaling, spin scaling, and the derivative discontinuity. However, these exact conditions have not been incorporated implicitly into machine learning models or into the pre-processing of large materials datasets. In this work, we propose a schematic approach that incorporates a given physical constraint into the learning framework as data augmentation, provided the constraint is defined by an equality. Specifically, we trained a 3D CNN model on an augmented molecular density dataset generated by applying the scaling property of the exchange energy functional with a chosen set of scaling factors. We found that the model trained on the constraint-augmented dataset predicts exchange energies that satisfy the scaling relation, while the model trained on the unaugmented dataset gives poor predictions for the scaling-transformed electron densities. This shows that incorporating exact constraints through data augmentation can deepen a neural network model's grasp of DFT theory and generalize NN-based XC functionals to a wide range of scenarios that are theoretically justified but not always experimentally accessible.
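
For the coordinate-scaling condition referred to above, the exact exchange functional obeys Ex[n_gamma] = gamma * Ex[n], where n_gamma(r) = gamma^3 * n(gamma * r). The sketch below illustrates, in a minimal and hypothetical way, how such an equality constraint could be turned into augmented training pairs; it assumes the density is sampled on a uniform cubic grid, the scaling factors are arbitrary examples, and the function names and values are illustrative rather than taken from the authors' code.

import numpy as np

def scale_density(density, spacing, gamma):
    # Uniform coordinate scaling n_gamma(r) = gamma^3 * n(gamma * r):
    # on a fixed cubic grid this amounts to multiplying the sampled values
    # by gamma^3 and shrinking the grid spacing by a factor of gamma,
    # which preserves the total electron number.
    return gamma**3 * density, spacing / gamma

def augment_with_scaling(density, spacing, e_x, gammas=(0.8, 1.25)):
    # Turn one (density, exchange energy) pair into several training pairs
    # that satisfy the exact condition Ex[n_gamma] = gamma * Ex[n].
    samples = [(density, spacing, e_x)]
    for g in gammas:
        scaled, new_spacing = scale_density(density, spacing, g)
        samples.append((scaled, new_spacing, g * e_x))
    return samples

# Illustrative usage: a random stand-in density on a 32^3 grid with
# 0.2 Bohr spacing and a placeholder exchange energy.
density = np.random.rand(32, 32, 32)
augmented = augment_with_scaling(density, spacing=0.2, e_x=-1.7)
print(len(augmented))  # the original sample plus one per scaling factor

Each augmented pair carries the rescaled grid spacing alongside the scaled values, so the network input and its exchange-energy label remain consistent with the exact scaling relation.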

Symposium Organizers

Mathieu Bauchy, University of California, Los Angeles
Mathew Cherukara, Argonne National Laboratory
Grace Gu, University of California, Berkeley
Badri Narayanan, University of Louisville
