Apr 23, 2024
5:00pm - 7:00pm
Flex Hall C, Level 2, Summit
Hongwoon Yun<sup>1</sup>, Woojong Yu<sup>1</sup>
Sungkyunkwan University<sup>1</sup>
The von Neumann architecture, which physically separates the CPU and memory, has long been dominant. However, it is limited in computational speed by the data-transfer bottleneck between processor and memory, and it wastes a large amount of energy. To address these speed and energy issues, we built a neuromorphic system based on an Au-nanoparticle floating-gate memristor (AuNp-FGM).

Our memristor, which uses graphene as the floating gate and MoS<sub>2</sub> as the channel, has been recognized for its excellent performance [1,2] and as one of the most promising candidates for neuromorphic systems [3]. In this study, by forming Au nanoparticles between the floating gate and the tunneling oxide, we fabricated a two-terminal memristor that operates in the ±3 V region. We also built a neuromorphic array and calculated the energy consumed during learning simulations.

The AuNp-FGM exhibits its memory characteristics through shifts in the Fermi level (E<sub>f</sub>) of graphene. The device shows a high on/off ratio exceeding 10<sup>6</sup>, retention of more than 9 hours, and robust endurance over more than 80,000 cycles. It also shows low cycle-to-cycle variability of C<sub>v</sub> = σ/μ = 3.6% (n = 90, where σ is the standard deviation and μ is the mean value). Furthermore, it shows excellent linearity, indicating applicability to neuromorphic systems. For 100-level potentiation (+4 V, 0.5 s), the non-linearity factor ranges from 0.1 to 0.6; for 100-level depression (−3 V, 0.2 s), it ranges from 2.3 to 4.6 (n = 15). A similar trend holds when the number of input pulses is changed (50, 100, 200, 300, and 400 inputs).

Based on the AuNp-FGM, we fabricated a neuromorphic array consisting of 2 neurons and 32 synapses. Three types of patterns (horizontal, vertical, diagonal) were used in 40 learning simulations. The total energy consumed was 70 µJ, confirmed to be a 97% reduction compared to the previous experiment [3].

References
[1] Vu, Q. A., Shin, Y., Kim, Y. <i>et al.</i> <i>Nat. Commun.</i> <b>7</b>, 12725 (2016).
[2] Vu, Q. A., Kim, H., Nguyen, V. L. <i>et al.</i> <i>Adv. Mater.</i> <b>29</b>, 1703363 (2017).
[3] Won, U. Y., An Vu, Q., Park, S. B. <i>et al.</i> <i>Nat. Commun.</i> <b>14</b>, 3070 (2023).
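For reference, the cycle-to-cycle variability quoted above is a coefficient of variation. A minimal sketch of the calculation in Python, using synthetic placeholder conductances rather than the measured data:

```python
import numpy as np

# Synthetic stand-in for 90 on-state conductance readings (siemens);
# the abstract reports n = 90 cycles and C_v = 3.6% for the real device.
rng = np.random.default_rng(seed=0)
g_on = rng.normal(loc=1.0e-6, scale=3.6e-8, size=90)

sigma = g_on.std(ddof=1)   # standard deviation across cycles
mu = g_on.mean()           # mean conductance
c_v = sigma / mu * 100.0   # coefficient of variation, in percent
print(f"C_v = {c_v:.1f}%")
```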
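The abstract does not state which model defines its non-linearity factor. One common convention in memristive-synapse benchmarking fits the conductance-versus-pulse-number curve to a saturating exponential, where a factor near zero means a nearly linear weight update; the sketch below assumes that convention and uses hypothetical data, so the fitted value is illustrative only:

```python
import numpy as np
from scipy.optimize import curve_fit

P_MAX = 100  # pulses per sweep (100-level potentiation)

def g_pot(p, nu, g_min, g_max):
    # Saturating-exponential potentiation model (assumed convention);
    # nu -> 0 recovers a linear conductance ramp from g_min to g_max.
    b = (g_max - g_min) / (1.0 - np.exp(-nu))
    return g_min + b * (1.0 - np.exp(-nu * p / P_MAX))

# Hypothetical conductance trace (placeholder, not measured data).
pulses = np.arange(1, P_MAX + 1)
g_true = g_pot(pulses, nu=0.4, g_min=0.1e-6, g_max=1.0e-6)
g_meas = g_true + np.random.default_rng(1).normal(0.0, 2e-9, P_MAX)

(nu_fit, *_), _ = curve_fit(g_pot, pulses, g_meas, p0=(1.0, 0.1e-6, 1.0e-6))
print(f"fitted non-linearity factor ~ {nu_fit:.2f}")
```

Depression can be treated symmetrically by fitting the decreasing branch with its own factor, which is how separate potentiation (0.1 to 0.6) and depression (2.3 to 4.6) ranges arise.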
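As a quick consistency check on the energy figures, the reported 97% reduction can be inverted to recover the baseline implied by the comparison with [3]:

```python
e_total = 70e-6      # J, total energy over the 40 learning simulations
reduction = 0.97     # reported reduction vs. the previous experiment [3]

e_baseline = e_total / (1.0 - reduction)  # implied previous total
e_per_sim = e_total / 40                  # average energy per simulation

print(f"implied baseline ~ {e_baseline * 1e3:.1f} mJ")  # ~2.3 mJ
print(f"per simulation ~ {e_per_sim * 1e6:.2f} uJ")     # 1.75 uJ
```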