December 1 - 6, 2024
Boston, Massachusetts
2024 MRS Fall Meeting & Exhibit
MT02.08.07

Deep Learning Assisted Hybrid Metrology of Nanosheet Transistors for Fast Film Characterization

When and Where

Dec 3, 2024
8:00pm - 10:00pm
Hynes, Level 1, Hall A

Presenter(s)

Co-Author(s)

Tao Cai1, Yifei Li1, Daniel Schmidt2, Rafael Jaramillo1

Massachusetts Institute of Technology1, IBM Research-Albany2

Abstract

Gate-all-around (GAA) nanosheet field effect transistors (FETs) are on the cusp of entering consumer products and enabling another leap forward in computing. To reach commercial implementation, improvements to in-line characterization are essential as critical dimensions shrink and demand higher measurement accuracy. This is especially true for GAA nanosheets, where the three-dimensional geometries, multi-step processing, and need for precise control may require real-time monitoring during manufacturing. In particular, the formation of the stacked nanosheet governs many of the device critical dimensions, such as the nanosheet dimensions and the pitch. However, few techniques can by themselves provide multi-stack dimensions and compositions in a fast and non-destructive manner. Here we present a framework that combines multiple metrology techniques, known as hybrid metrology, with machine learning methods to improve the accuracy and speed of data analysis for thin film characterization.

We demonstrate this idea by using a deep learning algorithm for combined analysis of spectroscopic ellipsometry (SE) and Raman spectroscopy data, enabling characterization of critical device parameters in multilayer Si and Si(1-x)Ge(x) thin films for fabrication of GAA nanosheet FETs. We employ a composite neural network that consists of a convolutional neural network to analyze SE spectra and a multilayer perceptron component to analyze the Raman data. Over one million SE spectra and Raman peaks are simulated as a training dataset using optical property databases from the literature, and both simulated and experimental test datasets are used to confirm the accuracy of the neural network. We first applied this method to a simple two-layer stack of Si and SiGe, where we demonstrated that only with the incorporation of Raman data does the neural network correctly predict the critical dimensions from a single SE spectrum, with a mean absolute error of 1%. We then extended this method to a full eight-layer stack of alternating Si and SiGe with varying SiGe thicknesses and compositions. Including Raman data improved the accuracy of the neural-network-predicted parameters by up to 50% for the full nanosheet stack, yielding a mean absolute error of less than 5% within milliseconds per stack. We believe these improvements result from the Raman data constraining the large parameter space of SE, which usually makes its analysis difficult without human input. Here we demonstrate the success of this method for silicon GAA nanosheet FETs, but it is applicable to any multi-stack thin film whose optical properties are known. These results are promising for future automation of rapid and non-destructive in-line characterization in commercial mass-production settings.
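The composite architecture described in the abstract can be pictured as two input branches, a CNN for SE spectra and an MLP for Raman features, fused into a shared regression head that predicts stack parameters. The following PyTorch sketch is purely illustrative and is not the authors' implementation; the input sizes, layer widths, and the output dimension (e.g. thickness and Ge fraction per layer) are assumptions.

```python
# Hypothetical sketch of a hybrid SE + Raman regression network (not the authors' code).
import torch
import torch.nn as nn

class HybridMetrologyNet(nn.Module):
    def __init__(self, se_channels=2, se_length=512, raman_features=16, n_outputs=16):
        super().__init__()
        # CNN branch: SE spectra (e.g. Psi/Delta vs. wavelength) treated as 1-D signals
        self.se_branch = nn.Sequential(
            nn.Conv1d(se_channels, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Flatten(),
        )
        se_out = 32 * (se_length // 4)  # length halved by each pooling step
        # MLP branch: Raman peak positions/intensities as a flat feature vector
        self.raman_branch = nn.Sequential(
            nn.Linear(raman_features, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        # Shared head: regress layer thicknesses and compositions for the stack
        self.head = nn.Sequential(
            nn.Linear(se_out + 64, 128), nn.ReLU(),
            nn.Linear(128, n_outputs),
        )

    def forward(self, se_spectrum, raman_features):
        fused = torch.cat([self.se_branch(se_spectrum),
                           self.raman_branch(raman_features)], dim=1)
        return self.head(fused)

# Example: a batch of 8 simulated measurements for an 8-layer Si/SiGe stack
model = HybridMetrologyNet()
se = torch.randn(8, 2, 512)   # SE spectra: (batch, channels, wavelength points)
raman = torch.randn(8, 16)    # Raman features: (batch, features)
pred = model(se, raman)       # (8, 16): e.g. thickness + composition per layer
```

In this kind of two-branch design, the Raman features act as an additional constraint on the SE inverse problem: the head sees both representations at once, which is one plausible way to realize the accuracy gains the abstract attributes to including Raman data.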

Keywords

Raman spectroscopy | spectroscopy

Symposium Organizers

Andi Barbour, Brookhaven National Laboratory
Lewys Jones, Trinity College Dublin
Yongtao Liu, Oak Ridge National Laboratory
Helge Stein, Karlsruhe Institute of Technology

Session Chairs

Andi Barbour
Lewys Jones
Yongtao Liu
