MRS Meetings and Events


DS04.12.04 2023 MRS Fall Meeting

An Intrusive, Bayesian Paradigm for Scientific Machine Learning Outperforms Neural Networks in Typical Scientific Modeling Contexts

When and Where

Nov 30, 2023
2:30pm - 2:45pm

Sheraton, Second Floor, Back Bay B

Presenter

Co-Author(s)

David Mebane1,2

West Virginia University1, KBR Wyle Services2

Abstract

Neural networks (NNs) are powerful tools for machine learning, with stunning results in computer vision and large language models dominating the news. However, NNs are often misapplied in scientific modeling contexts, where the dimensionality of the input-output space is small to moderate. There are numerous examples of other methods, such as decision trees and Gaussian processes (GPs), outperforming NNs in both accuracy and inference speed on tabular estimation tasks. A shift away from NNs in scientific modeling contexts therefore promises faster and more accurate performance. A framework for scientific machine learning in which fast, decomposed GPs represent well-defined scientific functions has shown considerable promise, outperforming recurrent neural networks (such as LSTMs) on multiple benchmark dynamic modeling tasks. These well-defined functions also present opportunities for multi-scale modeling, linking electronic and atomistic scales to device scales. Multiple applications in materials modeling for energy -- including solid-state batteries and high-temperature CO2 electrolyzers -- will be presented.
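To illustrate the regime the abstract describes -- accurate regression from a small number of observations of a low-dimensional scientific function -- the following is a minimal NumPy sketch of Gaussian-process regression with a squared-exponential kernel. It is an illustrative example only, not the presenter's decomposed-GP framework; the kernel, length scale, and toy target function are assumptions chosen for the demonstration.

```python
# Illustrative GP regression on a small 1-D dataset (not the presenter's
# framework). A zero-mean GP with an RBF kernel interpolates a smooth
# "scientific" function from only a handful of training points.
import numpy as np

def rbf_kernel(xa, xb, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    d2 = (xa[:, None] - xb[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(x_train, y_train, x_test, noise=1e-6):
    """Posterior mean and pointwise variance of a zero-mean GP."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train)       # test/train covariance
    K_ss = rbf_kernel(x_test, x_test)       # test/test covariance
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.diag(cov)

# Toy smooth target observed at only 8 points -- the small-data,
# low-dimensional setting where GPs are typically competitive.
x_train = np.linspace(0.0, 2.0 * np.pi, 8)
y_train = np.sin(x_train)
x_test = np.linspace(0.0, 2.0 * np.pi, 50)
mean, var = gp_predict(x_train, y_train, x_test)
```

With only eight training points, the posterior mean tracks the underlying function closely across the whole interval, and the posterior variance provides the calibrated uncertainty that a plain feed-forward NN fit to the same data would not.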

Symposium Organizers

Andrew Detor, GE Research
Jason Hattrick-Simpers, University of Toronto
Yangang Liang, Pacific Northwest National Laboratory
Doris Segets, University of Duisburg-Essen

Symposium Support

Bronze
Cohere

Publishing Alliance

MRS publishes with Springer Nature