MRS Meetings and Events

 

EL20.07.04 2023 MRS Fall Meeting

Integration of Ag-CBRAM Crossbars and Mott ReLU Neurons for Efficient Implementation of Deep Neural Networks in Hardware

When and Where

Nov 29, 2023
4:30pm - 5:00pm

Hynes, Level 3, Room 301

Presenter

Co-Author(s)

Jaeseoung Park1, Duygu Kuzum1, Yuhan Shi1, Sangheon Oh1

University of California, San Diego1

Abstract

In-memory computing with emerging non-volatile memory devices (eNVMs) has shown promising results in accelerating matrix-vector multiplications (MVMs). However, activation function calculations are still implemented with general-purpose processors or with large, complex neuron peripheral circuits. Here, we present the integration of Ag-based conductive bridge random access memory (Ag-CBRAM) crossbar arrays with Mott ReLU activation neurons for a scalable, energy- and area-efficient hardware implementation of deep neural networks (DNNs). We develop Ag-CBRAM devices that achieve a high ON/OFF ratio and multi-level programmability. Compact, energy-efficient Mott ReLU neuron devices, which implement the rectified linear unit (ReLU) activation function, are directly connected to the columns of the Ag-CBRAM crossbars to compute the output from the weighted-sum current. We implement convolution filters and activations for VGG-16 on our integrated hardware and demonstrate the successful generation of feature maps for CIFAR-10 images. Our approach paves the way toward highly compact and energy-efficient eNVM-based in-memory computing systems.
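The computation the abstract describes can be illustrated with a minimal software simulation: a crossbar performs an MVM in the analog domain (row voltages times cross-point conductances summed as column currents), and a ReLU neuron at each column rectifies the weighted-sum current. The sketch below is purely illustrative and assumes hypothetical dimensions and conductance ranges; it is not a model of the authors' actual devices.

```python
import numpy as np

# Illustrative crossbar MVM + ReLU simulation (assumed parameters, not the
# authors' hardware). Each cross-point conductance G[i, j] encodes a weight;
# applying voltages V to the rows yields column currents I_j = sum_i V_i * G[i, j]
# by Ohm's law and Kirchhoff's current law. A ReLU neuron per column then
# rectifies the weighted-sum current.

rng = np.random.default_rng(0)

n_rows, n_cols = 8, 4                            # crossbar size (hypothetical)
G = rng.uniform(1e-6, 1e-4, (n_rows, n_cols))    # conductances in siemens,
                                                 # standing in for multi-level states
V = rng.uniform(-0.1, 0.1, n_rows)               # input voltages on the rows

I_col = V @ G                # column currents = weighted sums (the MVM)
out = np.maximum(I_col, 0)   # ReLU applied to each column's current

print(out)
```

In the integrated hardware this rectification is done by the Mott ReLU neuron device itself rather than by digital peripheral circuitry, which is the source of the area and energy savings the abstract claims.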

Symposium Organizers

Gina Adam, George Washington University
Sayani Majumdar, Tampere University
Radu Sporea, University of Surrey
Yiyang Li, University of Michigan

Symposium Support

Bronze
APL Machine Learning | AIP Publishing

Publishing Alliance

MRS publishes with Springer Nature