Apr 22, 2024
4:30pm - 4:45pm
Room 320, Level 3, Summit
Pratik Brahma<sup>1</sup>, Krishnakumar Bhattaram<sup>1</sup>, Jack Broad<sup>2</sup>, Sinead Griffin<sup>2</sup>, Sayeef Salahuddin<sup>1,2</sup>
<sup>1</sup>University of California, Berkeley; <sup>2</sup>Lawrence Berkeley National Laboratory
Modern microelectronic devices such as silicon transistors are composed of multiple material interfaces whose electronic interactions significantly affect electron transport and, ultimately, device performance. These complex interfacial interactions between materials of crystalline, polycrystalline, and amorphous phases, together with boundary effects such as quantum confinement, present a significant challenge to the atomistic modeling of electronic devices. Traditional modeling approaches often use <i>ab initio</i> methods such as density functional theory for molecular dynamics and electronic structure calculations. However, these methods scale poorly with increasing system size because they rely on diagonalization of the quantum Hamiltonian, posing a challenge for fast and accurate simulations of electronic devices that contain thousands of atoms and multiple structural phases.<br/>In this work, we demonstrate the viability of graph neural networks (GNNs) [1, 2] to overcome this scaling challenge. Our architecture is designed to accelerate <i>ab initio</i> molecular dynamics and electronic structure calculations by characterizing the relationships between local chemical environments and global electronic transport properties such as carrier injection velocity and gate capacitance. For given macroscopic transistor dimensions, we use our neural-network-predicted atomic forces to generate the heterogeneous crystalline silicon channel–dielectric gate stack (amorphous SiO<sub>2</sub>/amorphous HfO<sub>2</sub>), with bond lengths within 3% of experimental values. Furthermore, we predict global electronic (density of states) and transport (injection velocity) properties of crystalline nanoslab silicon channels, showing good agreement with the baseline material simulation model—within 0.18% for density of states and 5.4% for injection velocity—for channel thicknesses outside the training domain, while accelerating the simulation by four orders of magnitude.
The obtained accuracies demonstrate that our neural network captures the structural confinement effects of thin silicon channels in advanced transistors. Overall, the scalability and accuracy of our predictions over a wide range of material and transport properties demonstrate the efficacy of GNNs to accelerate advanced electronic device simulations, paving the way for rapid design and modeling of next-generation transistors.<br/><br/>[1] Schütt, K. T., Sauceda, H. E., Kindermans, P. J., Tkatchenko, A., & Müller, K. R. (2018). SchNet - A deep learning architecture for molecules and materials. <i>Journal of Chemical Physics</i>, <i>148</i>(24). https://doi.org/10.1063/1.5019779<br/>[2] Unke, O. T., Chmiela, S., Gastegger, M., Schütt, K. T., Sauceda, H. E., & Müller, K. R. (2021). SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects. <i>Nature Communications</i>, <i>12</i>(1). https://doi.org/10.1038/s41467-021-27504-0