Apr 23, 2024
2:00pm - 2:30pm
Room 339, Level 3, Summit
Yoeri van de Burgt
Eindhoven University of Technology
The process of neural network training can be slow and energy-intensive due to the transfer of weight data between digital memory and processor chips. Neuromorphic systems can accelerate neural networks by performing multiply-accumulate operations in parallel using non-volatile analogue memory. However, the backpropagation training algorithm in multi-layer (deep) neural networks requires information on, and thus storage of, the partial derivatives of the weight values, preventing easy implementation in hardware.
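For context, in its textbook form the backpropagation recursion reads (with \mathcal{L} the loss, \sigma the activation function, and a^{(l)}, z^{(l)} the activations and pre-activations of layer l):

\delta^{(L)} = \nabla_{a^{(L)}}\mathcal{L} \odot \sigma'(z^{(L)}), \qquad
\delta^{(l)} = \bigl(W^{(l+1)}\bigr)^{\top}\delta^{(l+1)} \odot \sigma'(z^{(l)}), \qquad
\frac{\partial \mathcal{L}}{\partial W^{(l)}} = \delta^{(l)}\bigl(a^{(l-1)}\bigr)^{\top}.

Every layer's a^{(l)} and z^{(l)} must therefore be buffered from the forward pass until the backward pass reaches that layer, which is the storage requirement referred to above.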
In this talk I will highlight a novel hardware implementation of the well-established backpropagation algorithm that progressively updates each layer using in situ stochastic gradient descent, thus avoiding this storage requirement. We experimentally demonstrate the in situ error calculation and the proposed progressive backpropagation method using a multi-layer hardware-implemented neural network based on organic EC-RAM, and confirm learning characteristics and classification performance identical to those of conventional backpropagation in software.
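The actual in situ gradient computation happens in the analogue EC-RAM arrays themselves and is described in the talk; as a rough software caricature of the progressive idea, here is a minimal NumPy sketch that updates each layer as soon as its local error signal is available, rather than first accumulating and storing all partial derivatives. The layer sizes, tanh nonlinearity, and all names are illustrative assumptions, not details taken from the abstract.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer network; sizes are illustrative only.
W1 = rng.normal(scale=0.5, size=(16, 4))   # stands in for the layer-1 EC-RAM array
W2 = rng.normal(scale=0.5, size=(3, 16))   # stands in for the layer-2 EC-RAM array
eta = 0.05                                  # learning rate

def act(z):
    return np.tanh(z)

def act_grad(z):
    return 1.0 - np.tanh(z) ** 2

def progressive_step(x, target):
    """One stochastic-gradient step in which each layer is updated as soon
    as its local error arrives, so no global table of partial derivatives
    is ever stored (assumed squared-error loss)."""
    global W1, W2
    # Forward pass (in hardware: parallel multiply-accumulate operations).
    z1 = W1 @ x
    a1 = act(z1)
    z2 = W2 @ a1
    y = act(z2)
    # Output-layer error; propagate it backwards with the current weights,
    # then immediately update the output layer in place.
    d2 = (y - target) * act_grad(z2)
    back = W2.T @ d2
    W2 -= eta * np.outer(d2, a1)
    # Hidden-layer error and immediate update; only this layer's local
    # quantities (x, z1) are needed at this point.
    d1 = back * act_grad(z1)
    W1 -= eta * np.outer(d1, x)
    return y

# Toy usage: drive a random input toward a fixed target.
x = rng.normal(size=4)
t = np.array([0.2, -0.1, 0.3])
for _ in range(300):
    y = progressive_step(x, t)

Because the error is propagated with the pre-update weights, the computed gradients in this sketch match conventional backpropagation exactly; the point of the progressive ordering is that each update can be applied in the array as soon as its error term exists.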