Yutaka Ohno1,Takeshi Watanabe1,Koki Tatsumi1,Atsushi Kawaguchi1,Adha Sukma Aji1
Nagoya University1
The ever-growing volume of information to be processed in the artificial intelligence (AI) era calls for a breakthrough in computing paradigms. Brain-inspired computing using hardware-based synaptic devices, such as memristors, is attracting attention for its ability to mimic biological synaptic connections. Reservoir computing is a neural network approach that can process large amounts of data with short computing times and low training cost while maintaining low power consumption. Here, we report physical reservoirs based on carbon nanotube (CNT) memory devices, namely memristors and thin-film transistors (TFTs) with hysteresis. First, CNT/HfO<sub>2</sub>/CNT memristors were fabricated by sandwiching a thin HfO<sub>2</sub> insulator between two CNT network films. Ultra-dense memristor arrays can be obtained by connecting multiple electrodes to the bottom and top CNT films: each pair of electrodes acts as a distinct memristor, because the current path differs from pair to pair owing to the network structure of the CNT films. Using this system, we demonstrated Pavlovian (classical) conditioning, analogous to Pavlov's dog experiment. We also explored the potential of hysteretic CNT TFTs as memory devices for reservoir computing, and demonstrated NARMA (nonlinear autoregressive moving average) tasks using CNT TFTs with multiple nodes.